Article

Evaluating a University-Wide Digital Skills Programme: Understanding the Student Journey

by Nabila A. S. Raji 1 and Eleanor J. Dommett 1,2,*
1 Centre for Technology Enhanced Learning, King’s College London, London SE1 9NH, UK
2 Department of Psychology, Institute of Psychiatry, Psychology and Neuroscience, King’s College London, London SE5 8AF, UK
* Author to whom correspondence should be addressed.
Educ. Sci. 2024, 14(12), 1295; https://doi.org/10.3390/educsci14121295
Submission received: 28 September 2024 / Revised: 18 November 2024 / Accepted: 22 November 2024 / Published: 26 November 2024
(This article belongs to the Section Higher Education)

Abstract

Digital competencies are critical to success in higher education, yet these skills are often not explicitly taught to students. We have previously designed and evaluated a university-wide digital skills programme using quantitative methods. In the current study, we aim to better understand the student experience of this programme by conducting semi-structured interviews with those completing it. Twelve students were interviewed, and data were thematically analysed to reveal five themes. Firstly, students defined digital competencies in line with tridimensional models but also noted that these competencies were deployed in a goal-directed fashion. Secondly, prior learning was explored, with some students noting they had received training as part of specific qualifications at school but many relying on self- and peer-teaching. This fed into the third theme, relating to motivations for training, in which students noted the appeal of a comprehensive programme with certification on completion but also a need to address their lack of skills or confidence and to maximise their university experience. The fourth theme revealed that the student learning journey through the programme varied considerably. Online learning was perceived as having strengths and weaknesses, and whilst the diversity of resources was welcomed, pacing was mixed. Finally, the data demonstrated that the training was impactful, both in terms of teaching and learning and the wider student experience, allowing students to become more digitally aware and proficient in all areas of digital competency. The findings of the current study indicate that there is value in offering university-wide digital skills training.

1. Introduction

Higher education (HE) has been undergoing a process of digital transformation in recent decades, during which digital technology and internet information systems have been utilised to improve teaching, learning, and administration within the sector [1]. Most universities now use some form of virtual learning environment (VLE) with a range of tools such as lecture capture and online forums [2,3]. The use of these technologies was further accelerated during the COVID-19 pandemic, and it is likely that the post-pandemic era will see greater use of blended and fully online courses, meaning a higher reliance on technology [4]. Whilst such delivery modes offer various benefits, including greater flexibility [5] and better outcomes [6], they arguably place a greater demand on students to have sufficient digital competencies to engage in learning [7]. The effective use of digital technology by students has been found to relate to better self-efficacy [8], academic performance [8,9], connectedness, and well-being [10], even in face-to-face teaching, indicating the importance of these competencies irrespective of the mode of learning. Additionally, there is a need for university students to be digitally competent to support employability. Digital technologies have shaped, and continue to shape, the economy, driving global demand for digital competencies [11]. For example, such competencies are considered a key graduate attribute [12,13] and are a focal point for the European Council, which considers them one of eight core competencies for lifelong learning [14] and has declared that, under the Digital Decade commitment, the EU aims for 80% of the population aged 16–74 years to have basic digital skills by 2030 [15].
Given the importance of digital competencies, it is necessary to be clear on exactly what the term means; digital competencies are also often referred to as digital capabilities, skills, or literacies, with the terms used interchangeably. Various definitions have been developed, beginning with those related to computer literacy, advancing to the use of a range of technologies, and now incorporating functioning in digital communities through knowledge creation, sharing, and communication [16], the evaluation of information [17], and social issues such as privacy [18]. Digital competency frameworks tend to be multi-faceted. For example, Eshet-Alkali [19] proposed a five-faceted model consisting of photo-visual, reproduction, information, branching, and socioemotional components, all of which would allow an individual to create original work. Ng [18] built on this to create a three-component model consisting of technical, cognitive, and socio-emotional components, the last of which included topics relating to communication, safety, and privacy. Later models also took a multi-component approach, comprising literacy, aptitude, and creativity [20] or operational skills, applied competency, and thinking and awareness [21]. Across frameworks, there is recognition that digital competency should include a technical or operational component, i.e., knowing how to do something; a cognitive component, i.e., scrutinising information and skills related to creating knowledge; and an etiquette component relating to legal, ethical, and social issues [13]. Aligning with this tridimensionality, digital competency in the context of university students is defined as the knowledge, skills, and attitudes required to effectively use digital technologies to evaluate, consume, and produce learning information, and to communicate and collaborate with others for learning [22,23].
One of the barriers to developing these competencies in university students is the somewhat outdated and false belief that they do not need to be explicitly taught because today’s students come to university digitally equipped [24]. This belief stems from the fact that the majority of those attending university can be considered digital natives [25], individuals who have grown up in a world of ubiquitous technology and the internet. Although digital natives were originally proposed to be a more digitally literate generation, significant debate has ensued [26], and the idea is widely thought to lack evidence: digital proficiency in today’s students remains low [27,28,29,30], and considerable variation in skill level has been reported according to factors such as gender and socioeconomic status [31], meaning assumptions about digital competencies should be avoided. Although the evidence does not support digital nativity, this rhetoric has been extremely persistent and continues to dominate research and practice, especially in Western thinking [32], meaning that universities may not prioritise support with digital skills. Not prioritising this support misses an opportunity because, despite not being as digitally able as initially proposed, individuals within the so-called digital native generation can identify their skill level, including areas of weakness [10,33], and designated training in digital competencies has been demonstrated to be beneficial [18,24,29]. As such, it is recommended that universities do more to support these competencies, moving beyond short training courses [33] to offer more substantive training [34].
When developing this kind of training, several approaches can be taken. For example, training could be isolated to specific periods, such as induction weeks, which aim to support students in the transition into university [35] but are often associated with information overload [36]. Alternatively, training can be continuous across a term, a year, or even the whole degree. A continuous approach is suitable for digital competencies given that they take a considerable amount of time to develop [37], meaning they are unlikely to be successfully taught and developed in short periods. As well as the duration or timing of training, there is debate over whether transferable skills, including digital competencies, should be embedded in the core degree curriculum or taught as a separate strand. Embedding skills has the advantage of creating a more integrated student experience [38,39] and may result in higher engagement, given that research shows students do not engage well with general skills development [40,41,42,43]. However, it has also been argued that where a skill is fundamental or mandatory, as digital competencies are, university-wide training may be better because it is cost-effective and the need spans disciplines [44].
The present study describes a qualitative evaluation of a university-wide digital skills training programme at a large UK university. The programme development and an initial quantitative evaluation using Kirkpatrick’s four-area evaluation criteria [45] have been described previously [24]. That evaluation found that around 30% of students perceived a mismatch between the digital skills required for learning at university and the training received prior to attending. Completion rates on the programme were high, and engagement was driven by various factors, including certification and a self-identified need for help with these skills. The benefits of the training were not uniform, however, with students earlier in their degree and international and BAME students finding the programme more beneficial. Furthermore, free-text survey responses revealed a range of impacts, from greater confidence to awareness and efficiency. The nature of the questions, however, provided limited depth, and as such, whilst this previous evaluation provided some insights, the aim of the present qualitative work was to better understand the student journey by using semi-structured interviews to produce the rich data that the quantitative evaluation could not.

2. The Essential Digital Skills Programme

The Essential Digital Skills (EDS) programme was developed based on a Digital Capabilities Framework created for the institution by a team of staff, including academic, student services, library, careers, and IT staff, with additional input from a student digital education committee. The framework is available online [24] and is based on previously developed frameworks within HE [46,47,48]; we therefore anticipate that many of the components within the EDS programme will align with training and support available elsewhere in the sector. The framework comprises digital identity, well-being, learning and development, ICT proficiency and productivity, information, data and media literacy, digital creation, problem-solving, innovation, digital communication, and collaboration and participation, all of which were identified in a recent global systematic review of digital skills in higher education [49]. Although the framework aligns with sector-wide views of requirements, it is unclear exactly what universities offer in terms of training, as this information is not typically featured in research papers or public-facing websites, meaning we cannot be certain what training is available elsewhere.
The EDS programme is delivered entirely online via the institutional VLE and is advertised as requiring around 30 h of study, with phased release of content from August to November of the first term of the academic year. The content is divided into five overarching topics, each containing 3–6 activities. These activities are delivered in four sections or chapters such that the content of each chapter is designed to coincide with the wider student experience in their degree. For example, an introduction to the VLE is delivered prior to or soon after arrival, whilst using feedback comes later after some assessed work is likely to have been submitted (see Figure 1).
The intention is that students complete the programme in their first year at the university, irrespective of whether they are studying for a first degree or a higher degree. All students can self-enrol in the programme, which is promoted through various communications (induction, programme, and faculty). Once enrolled, they can dip in and out of the programme or complete it in its entirety. Most activities (18/21) are associated with a short multiple-choice question (MCQ) quiz consisting of five questions, which can be attempted up to eight times. For undergraduate and taught postgraduate students, gaining 80% on each of these quizzes qualifies them for Higher Education Achievement Record (HEAR) accreditation. Postgraduate research students are not eligible for HEAR accreditation; however, all students are eligible for a digital certificate upon successful completion of the programme. Students who enrol on the course and choose not to complete it are not penalised in any way, as the programme is optional. The programme remains open for the full academic year, closing in August; however, those wishing to gain HEAR accreditation must complete the programme by June for this to be registered.
A previous evaluation of the programme, conducted in 2021 after its first presentation, revealed high completion rates, with students indicating a positive reaction to the training and particular emphasis on increased confidence and proficiency. As indicated above, the training appeared to have brought about moderate behaviour change, although individual differences existed according to ethnicity and student status (e.g., home, EU, or international). The programme also contributed more to the skills of students when completed earlier in their studies [24]. Whilst the findings of this early evaluation were positive, since that time the programme has been updated with the addition of content on generative AI and completed by hundreds more students. Furthermore, the quantitative data collected previously do not permit us to fully understand the rich experiences of students as they work through the training. Therefore, the current study employed qualitative methods to address an over-arching research question of ‘How do students experience the EDS programme?’, with the specific objectives being to understand (i) their motivations to complete the training, (ii) their learning journey, and (iii) the impact of the training.

3. Methods

3.1. Design and Recruitment

This study used an inductive qualitative approach with semi-structured interviews. We opted for individual interviews rather than focus groups because individual variation in digital capabilities is known to be high, and focus groups, which rely on group discussion to identify perceptions [50], may have masked this variety and thus prevented us from fully understanding the student experience. Semi-structured interviews were chosen rather than structured or unstructured interviews because they are central to education research [51] and support a dialogue between researcher and participant that can address specific research questions and objectives, as we had here, whilst allowing the flexibility to explore unexpected comments. This was critical in this instance, given the diversity of students who could complete the programme (studying any discipline and at any degree level). Participants were purposefully sampled from those completing the university-wide EDS programme, drawing from different academic disciplines, year groups, and levels of study (undergraduate and postgraduate). Interviews took place online between June and August 2024, and all were conducted by the second author, who is experienced in qualitative research. The final cut-off for completing the EDS programme was June. Ethical approval was granted by the Institutional Ethical Review Committee (MRA-20/21-21928). Participants received a GBP 10.00 voucher for their time.

3.2. Procedure

Interested participants completed an online expression of interest form in which they provided key demographic data and details of their university studies as well as a contact email address for the interview to be booked. During each interview (lasting between 30 and 60 min) participants were asked to explain what they understood by digital competencies, describe their previous training and motivation for completing the EDS programme, and reflect on the training experience and its impact.

3.3. Data Analysis

Interviews were transcribed automatically through MS Teams and then manually checked and corrected, with any identifiable information removed. This process was completed by the interviewer within 48 h of each interview to ensure that contextual detail that may impact transcription was not lost [52]. Reflexive thematic analysis with an inductive approach was used to code the data and generate themes [53]. The six-stage analysis process involved data familiarisation, coding, theme extraction, and the review and naming of themes, before finally completing a narrative analysis [54]. Quotes are provided as evidence of findings using the participant ID number [55]. Once all transcripts were corrected, the interviewer (second author) conducted the analysis and then shared the findings, including the transcripts, with the co-researcher (first author) for review before the themes were finalised for reporting.

4. Results

Twelve students who had completed the EDS programme were interviewed. Of these, six identified as female, five as male, and one as non-binary. They were aged 19 to 28 years (M ± SD, 25.75 ± 5.89). Ethnicities varied, with four White, four Asian, one Black, one Latino, and one of mixed (Asian/White) ethnicity. Eight of the students declared no disability; the remaining four reported sensory, mental health, or long-term conditions, with three also reporting being neurodivergent. The sample included students studying for undergraduate degrees (e.g., BSc, N = 5), postgraduate taught degrees (e.g., MA, N = 5), and doctorates (N = 2). Those studying for undergraduate degrees included students in years 1, 3, and 4 of their qualification. All students on postgraduate taught degrees were in their first year, with most completing a single-year qualification. Of those studying towards a doctorate, one was in their first year and one in their third year. Seven students were home students, one was from an EU country, and the remaining four were international students. Seven faculties were represented, with students studying across academic disciplines and no two students studying for the same degree. Areas of study were health-related (N = 4), finance (N = 2), policy and security (N = 3), humanities (N = 2), and education (N = 1). Students reported completing the EDS programme at various points in the year: around half reported completing it just before the final cut-off in June, and therefore within 2 months of the interview, whilst the remainder completed it earlier in the year, with the earliest completion around January.
Five themes were constructed, each with several subthemes: (1) defining digital competency; (2) prior learning; (3) motives for training; (4) learning journey; and (5) impact of training. These are summarised in Table 1. The themes did not all clearly relate to one another; for example, the definitions given did not link to the amount of prior learning, but prior learning did sometimes shape motivations and impact.

4.1. Defining Digital Competency

Participants gave a range of definitions for digital competency, acknowledging “it’s a broad church” (P3) and aligning with a tridimensional approach. Many focused on operational components, i.e., knowing how to use specific tools, most notably Microsoft Office and Google, with digital competency defined as “related to like technical kind of thing including the Microsoft Office, like word how to do a, how to write or type in a Word document, or do kind of Excel kind calculations. So, to create a table, kind of do a graph. Then, like, do an email like put a mail, then yeah. Sort of like how to do a Google search” (P4). However, cognitive components were also identified, such as “navigating new sources, seeing what’s trustworthy, seeing what’s not” (P11), along with those relating to etiquette, with reference to “ethical issues, communication issues” (P1). In some cases, participants noted that it was not just about what you can do now but also in the future: “being able to keep up with […] changes” (P12) in the tools that are being used. As well as recognising the breadth of competency, participants noted that the use of digital tools had to achieve a specific purpose, such that digital competencies were about using “information technology to meet [a] goal or aim” (P3). Within this, the goal could be personal, study-, or career-related. It is noteworthy that the definitions provided did not appear to vary with level or discipline of study.

4.2. Prior Learning

Prior learning of digital competencies varied considerably, and three subthemes emerged. Firstly, some students had experienced formal training previously, but this tended to be within specific qualifications. For example, a student who had completed an Extended Project Qualification (EPQ) alongside their A-levels reported learning about database searching: “I did learn a little bit about, I think it was called Boolean logic which is you know how to search for certain articles cause it’s like hundreds or thousands” (P11). Another who had completed a GCSE in Computer Science indicated that they had been “trained in the basics at school” (P7) as part of this GCSE.
Secondly, most students indicated that much of their digital competency had been self-taught using ‘just in time learning’ where they had searched online, typically using Google or YouTube to find out how to do something as they were doing it: “Often things like YouTube videos was a commonplace where I would find and still do use” (P3); “I would search on them like in YouTube or Google and well like read some kind of information the Google how they are done like in a basic knowledge.” (P4).
Finally, students also reported learning from peers, often by simply being around them: “so much of it is peer learnt within digital skills, certainly within my experience. So, I think being around other people and just seeing their work and seeing their ideas is quite helpful” (P3). The latter subtheme was more common in students who had been at university longer, presumably because they had an established peer group to learn from.

4.3. Motives for Training

Students reported a range of motivations for wanting to complete the optional university-wide EDS programme. Firstly, motivation came from a fear or lack of confidence in working digitally, especially given the rapid speed of change in technology: “I am not good at digital technologies and digital technologies have been improved so quickly, so I’m always confused and struggling to catch up” (P1). Secondly, students reported being motivated by identifying skills that they felt they were lacking. These could be specific or general. For example, students identified specific skills useful for assessment: “there is, was a section on plagiarism, if I’m not wrong, yeah, that’s the main thing that I was interested in because I just wanted to be sure that I don’t, uh, plagiarise” (P8). General skill shortages were also noted, particularly for students coming from outside the UK: “Everything is more of like a software-based [than my home country]. So, I found that like useful if I know more about how to do this or how these are working or how is the background of this things, how they are working. Basically, so that’s why I meant to more interesting in this module.” (P4); “Well, I come from [another country], so I thought that my tech skills were not very well developed. So, I thought that by doing that course I could potentially improve my skills and especially get used to all these technological interfaces that I didn’t have access to, I didn’t have access to when I was studying my undergrad in my country.” (P6). The comments from international students were similar irrespective of whether they were studying at undergraduate or postgraduate level, indicating that even those completing a first degree outside of the UK felt less equipped.
Thirdly, students noted the appeal of a clear and comprehensive curriculum, possibly in contrast to their self-taught prior experiences: “I would say the fact of having a curriculum […] I like everything to be ordered in my head, so I like to start really step by step and not like jumping from one section to another one but starting from the beginning and then progressing towards” (P2); “the programme is first very comprehensive” (P6). Fourthly, the availability of certification on completion appealed, often because it could be used when applying for jobs: “The biggest factor for me was the fact that I would get a little certificate at the end that said, I have got essential digital skills and I think I’m very used to saying on my CV” (P3). Finally, students noted that completing the course was about maximising their university experience: “I wanted to be involved in as many things as possible and kind of make the most of my experience back at uni really” (P10); “so every little thing that the university offers on all the different levels is another, I guess it’s such a perk. And so, the fact that this Essential Digital Skills course existed, and I didn’t pay for it. OK, fine. I’m paying my fees, but that’s irrelevant—it’s a perk.” (P7). The appeal of a clear curriculum, certification, and maximisation of the university experience was present for students at all levels, even those with existing careers (e.g., a medical doctor completing a PhD).

4.4. Learning Journey

As might be expected for an optional, online, self-paced course, the approaches taken and experiences varied, with several subthemes emerging. Firstly, online delivery was recognised as a double-edged sword by students at all levels. All students appreciated the online delivery, typically citing not having to travel or fit around specific class times as an advantage: “I think if, I’d known that I’d need to be committed at, you know, a certain time or certain period, I think I probably wouldn’t have done it. But just because I have this, you know it’s accessible whenever I have time or whenever I’m able to. That’s, you know, maybe I thought, OK, I can do it.” (P8); “Yeah, because I know how difficult it would be to get, to get time constraints with the course and I mean sometimes generally, I mean the session could be set and then you possibly have a class that is clashing around the same time or you’re staying very far away from [the campus] and then you say “Must I really move to [the campus] for this?” (P5). However, despite students enjoying the flexibility of online delivery, several suggested additional optional in-person sessions could be beneficial: “So maybe it’s nice to be next to someone to do it, because I like human contact. So that’s something as well.” (P2). Related to this, some found the amount of content overwhelming in places: “Where you can’t physically see your own progress as you’re sort of turning pages erm, it can feel quite daunting” (P3); “I think for the longer parts it could be a bit overwhelming” (P9). Despite this, it was recognised that the sense of overwhelm could give way to curiosity in some cases: “You get overwhelmed in a sense, but if you find something that you like and then you’re able to function and dig in and find and read up more and get more clarity and take advantage of the resources” (P5).
Secondly, all students valued the diversity of resources: “I can keep focusing on just reading or watching video. The multiple resources triangulated my focus and knowledge” (P1). They also considered the mix of video, text and quizzes to be “super healthy” (P10). Thirdly, the use of continuous assessment, in the form of MCQ quizzes at the end of each block was considered helpful to monitor progress, “I thought it was a good way of just testing your knowledge, but not to like a massive extent that it could potentially turn people away from the course.” (P11), and was seen as necessary, “assessment would be necessary and informative because we could reflect on what we learned, or sometimes I missed some important information, but the quiz reminded me that” (P1). However, some students would have appreciated more diversity of question types, particularly problem-based assessment (“making scenario problem-based ethical questions I think could fit really well”, (P12)), and more feedback (“adjust some of the assessment to give more feedback when there’s mistakes so people would know where to go back” (P2)).
Fourthly, the programme was often set aside for periods of time, resulting in a varying journey to completion. This was not due to a lack of time, with students taking a range of approaches, varying from a ‘little and often’ approach, “I was like, OK, I’m gonna start doing like a little bit every day, and just you have to just keep consistent. And yes, I do work full time and I am studying as well but as long as you do something consistently a little bit every day, you kind of get through it really […] They give you plenty of time and you just have to do a little bit every day” (P10), to a ‘binge’ approach, “So I did about 80% of it in in an evening for four hours and then did it” (P7). Rather, it was because the programme was not a priority: “So, it couldn’t be a priority because I guess your PhD is your priority and also having time away from work should be a priority, so it was it was there, but it you managed to do it, but it wouldn’t have been at the top of the list” (P8). Additionally, once the training had been set aside, students often found it tricky to pick up again after a period of absence: “When I take time between the programme, I got lost where to start” (P1). Many re-started the programme when the reminder of the completion deadline was sent: “I put it aside for a while in the middle and then I finished it when the deadline was informed” (P1), “you know, made me finish it up because I was reminded that the deadline is coming soon” (P8).

4.5. Impact of Training

Perhaps unsurprisingly given that only those completing the EDS programme were interviewed, all students reported finding the course helpful. However, within this, several subthemes emerged. Firstly, students reported an impact on their teaching and learning experiences, i.e., it had impacted their degree studies. This was typically in terms of helping them understand and produce assessed content: “I think it can be very overwhelming to be set an assignment, which is described in quite a vague way, and to think what on earth am I going to do to try to tackle that and how am I going to care about approaching that? But what the Digital Skills programme has done is it’s really sort of very neatly summarized everything. And even though I don’t feel like an expert on all of the things, it’s covered well, I do feel like, is, I know a little bit more about what I know. It’s the known knowns and the known unknowns as well.” (P3); “In my assignment it helped me doing my academic kind of or things regarding my Turnitin, checking my Turnitin score” (P4); “I recognize that if you can use the library or other resources in a more effective way that could then be useful when building up the bank of information required before one takes an essay or an assessment of some kind” (P7); “I think in the referencing side and the resources, that’s what I, that’s what I have learned” (P10).
Secondly, they reported that the programme had raised their awareness of what digital tools were available and their proficiency in using them, which in turn could benefit their studies: “Like, oh, we have a free like a software subscription for this kind of software or [the university] provide this kind of services after doing this module” (P4); “I could actually use all the tools that the university provides because, sometimes I mean, of course you learn how to use [the institutional VLE] for example, but you don’t know all the like all the functions” (P6). Additionally, awareness of wider issues was raised, such as GDPR or data privacy.
Thirdly, students reported benefits in terms of being able to communicate and have a stronger positive digital presence. In some cases, these related to their studies, but in others, it was related to extra-curricular activities: “this might be…sound quite small, but after looking at how to write emails I am writing my emails well, when I send them off to teachers, to students and actually at the moment I am a student rep, so the communication has improved a lot. […] So, for example, before I wasn’t very active on LinkedIn and then after the course and I was like oh maybe I need to invest more in my digital presence” (P10). The impact of this was also considered to increase connections with others: “one of the bigger outcomes was that I’m now, I’m more active on social media […] the fact that, like I now post a bit more on social media, I now know a bit more about it. It’s helped me connect with other my, my friends and other King students in the way I wouldn’t have done before. And I’m seeing things now. Now I’m more active on Instagram and LinkedIn is that I’m seeing more of what other students are doing, and it’s also in terms of finding just small things like events and society stuff.” (P11).
Finally, students noted that the programme had impacted their wider student experience. For example, students reported that they felt supported by the university: “I have felt that the university provides a lot of resources and that actually the university is concerned or makes sure that they provide you like a lot of value in different ways” (P6), and that some of the content helped with extracurricular roles, “I’ve become a treasurer for a society and we’re doing community leaders training right now. And one of the areas that we’re learning about is about the GDPR, and I think it was nice to have a sort of foundational knowledge about it beforehand” (P11). Additionally, the programme gave students a sense of progress and respite from their degree studies which was a positive experience for them outside of their degree work: “Doing the, doing the digital skills, was almost the respite […] because I could very clearly see my progress. You can tick little boxes to see it, whereas when you’re doing in a systematic review and you’re thinking, OK, I’m a week in and I’ve got no further, it feels I was a week ago, or four months ago in my case, then you’re left thinking it’s hard to get that sort of satisfied. So, it did leave me […] feeling ok. OK, I did do something good today, so I quite liked that” (P3).

5. Discussion

The aim of the present study was to better understand the experiences of students completing a university-wide digital skills programme at a large London university. Whilst previous quantitative evaluation had found positive effects of training, the use of semi-structured interviews allowed greater exploration of experiences.

5.1. Student Perceptions of Digital Competencies and Training

Several themes emerged from the data. Firstly, after completing the programme, students defined digital competencies in line with existing frameworks, adopting a multi-faceted approach encompassing operational, cognitive and etiquette components [13], although the majority focused on step-by-step use and therefore operational elements. Irrespective of how digital competencies were defined, students saw them as goal-directed behaviour, typically linking them to studies but also social and career-related activities. Secondly, whilst some students reported receiving specific prior training in digital skills, this was only if they had undertaken certain qualifications, such as the EPQ, which has previously been shown to assist in developing digital skills, specifically in the cognitive dimension [56], and is associated with a greater likelihood of entering higher education and better outcomes [57]. Most students discussed how they had taught themselves, through trial and error, to use new devices and complete certain activities. The use of trial-and-error learning aligns with Prensky’s original idea of digital natives [25], as he identified that this generation did not use manuals but rather worked things out—something also referred to as ‘Nintendo over logic’ [58]. However, it is important to acknowledge that this approach to learning far pre-dates the idea of digital natives, as it aligns with experiential learning discussed by the likes of Piaget, Vygotsky, and Socrates [59]. Whilst the methods of self-teaching identified here rely on modern tools like Google and YouTube, the principles are not new, and YouTube in particular is increasingly recognised as a learning platform [60]. Students also reported learning from peers, something previously found to be helpful in skill development [61].
Based on the ages of the cohort interviewed, all would be classed as so-called digital natives [25], and yet some of the motivations for completing the training revealed that students were not fully comfortable in the digital world, in line with previous research showing they could identify weaknesses in this area [10,33]. They cited fear of digital tools and a lack of confidence, as well as a lack of skills. This was particularly prominent in those who had completed previous study outside of the UK and so had never submitted work electronically or used an online learning platform. This aligns with previous work noting a global digital divide [62] and with the previous evaluation, which showed students from outside of the UK felt a greater impact of training [24]. Other motives included the appeal of a clear and comprehensive curriculum and certification of completion. Student comments indicated that they recognised the limitations of their self-taught approach, which may explain the appeal of the training programme. Furthermore, certification was previously found to be a key factor in completing the course [24], with students in the current study stating that it can be used as evidence when applying for jobs. An additional motivation stemmed from wanting to make the most of their university experience and, within this, gain as much value as possible. The idea of enhancing the student experience by offering a range of extra services and activities is widely acknowledged by universities as valuable [63].
The student journey through the training was variable. Whilst most students appreciated the online delivery, some found this overwhelming at times, although the diversity of resources and regular MCQ quizzes were deemed helpful for maintaining engagement. The main reasons given in favour of online delivery were the convenience of not having to travel to campus, aligning with previous advantages identified for digital tools in online platforms [64], and being able to self-pace learning [5]. Similarly, the preference for multiple types of resources, including video [65] and regular, low-stakes assessment [66], has been previously reported. There was considerable variation in how students worked through material, with some taking a ‘little and often’ approach, whilst others took a ‘binge approach’, working through large chunks at a time, despite the staggered release of material. Although the programme was designed with a ‘little and often’ approach in mind, binge learning has been previously found in online learning [67]. The reasons for binge learning identified in the literature are varied and include convenience [67], preference for massed presentation of information [68], or simply greater enjoyment from the fluency and flow of learning [68]. However, this binge approach may not have the benefit of spaced learning, with research showing that information presented at different points in time can yield better outcomes [67]. The benefits of spaced, rather than binge learning, can be explained by cognitive load theory, which attempts to explain how people learn and store new information by considering the amount of information that working memory can process at any given time [48]. Research shows that following binge (or massed) learning, working memory capacity is reduced and cognitive load is higher, which is interpreted as a depletion of working memory resources in comparison to spaced learning. Such a depletion can result in poorer storage of information [69].
Within the current study, both front-binging behaviours, i.e., accessing all content as early as possible, and back-binging behaviours, i.e., completing it all at the end, were identified [67]. The reasoning for the different behaviours varied. For example, one PhD student thought it was a good use of their time whilst waiting for their PhD to gain momentum, while another student reported binging just before the completion deadline. Further research should examine whether particular student characteristics predict different paces of study and outcomes of learning. Despite the variation in when students completed the programme, several commented that it was best completed earlier to gain the benefits. Future research could look more carefully at the impact of spaced learning, the timing of the overall programme, and front- and back-binging patterns on outcomes.
Despite the varied journeys through training, students generally reported a positive impact. The impact spanned their teaching and learning experiences, typically relating to assessment, and a greater awareness of the digital tools available within the university. Interestingly, the impact was noted across the dimensions of digital competencies, with students feeling more proficient and better able to find suitable information and communicate effectively, indicating that the training had fulfilled its goal. Perhaps a more surprising aspect of the impact was around the wider student experience. Students reported feeling more connected to others, something increasingly recognised as critical at a time when student mental health difficulties and loneliness are deemed to be at crisis point, necessitating a whole-university approach [70]. Additionally, there were reports of the programme helping students with their extracurricular activities, such as student societies, which are known to foster integration and reduce attrition [71], or giving them a sense of progress when other activities were slow to come to fruition. An impact of this kind suggests that digital skills training can play a part in the whole-university approach to student mental health and help foster a supportive community in HE.
Whilst the results of the current study relate to a specific digital competencies programme delivered at just one university, and generalization must therefore be carried out cautiously, the findings of the present study, along with the previous evaluation [24], provide some insights into how universities might approach this kind of training. Based on the current findings, we tentatively suggest that universities should invest in developing university-wide digital capabilities training to support teaching and learning as well as the wider student experience. Furthermore, we would recommend several features be considered. Firstly, such training should begin with an assessment, possibly self-assessment, of a student’s current skill level to help them access the training they need and to recognise the diversity of skills on arrival in higher education; this would create a sense of personalised learning. Secondly, consideration should be given to the characteristics of the student body. The current university has a high proportion of international students, and data from this study and the previous evaluation [24] suggest that they may particularly benefit from the training; each university will need to consider its own student body carefully. Thirdly, incentivising completion with certification or credit points may increase engagement and so be a worthwhile endeavour, along with ensuring students have the capacity for the extra training by paying close attention to overall workload. Finally, when developing such programmes, it is important to ensure an engaging and diverse learning experience, one that can be studied at different paces by different students whilst facilitating a spaced learning approach where possible. Although supporting flexibility will likely engage more students, encouraging spaced rather than binge-learning behaviours is more likely to yield better outcomes in terms of the skills learnt [69].

5.2. Limitations

Whilst the findings of the current study provide novel insights into student experiences of online university-wide digital competencies training through the recruitment of a diverse group of students and the use of qualitative methods, the study is not without limitations. Firstly, the study is, in effect, a case study which evaluates an online digital skills training programme for students at a university where most teaching is in-person, complemented by the use of a VLE. Students learning in a different mode may have different training requirements, motivations, and approaches. Secondly, we only interviewed students completing the EDS programme, and future research might also investigate the experiences of students who were unable to complete the programme. Thirdly, although we reached saturation in our data and the sample was diverse in that we heard from students across a range of genders, ethnicities, levels of study, and disciplines, the overall sample size was small. The sample size aligned with guidance [72] but the small size could impact the reliability and generalizability of the findings. Whilst the relationship between qualitative research and generalization is a controversial one [73,74], our aim was to gain a richer understanding of the student journey through the EDS programme and, therefore, to be able to internally generalize. That is, we hoped to be able to generalize within the setting and population, rather than to other training programmes or to non-students [73]. It should therefore be noted that the smaller sample size may have impacted this and future work should consider further evaluation, building on the present study. For example, it has been suggested that a short questionnaire applied to other students may help generalize findings [72]. Fourthly, and related to the point of generalization, the study focused on a specific digital competency training programme offered at one university, and as such the same experiences reported here may not be found elsewhere. We cannot externally generalize these findings, although the themes that have emerged may help others develop topic guides or surveys to evaluate their own programmes. Fifthly, we could have taken more steps to ensure that our analysis of the data was unbiased. Due to the timing of the interviews at the end of the academic year leading into the summer break, we did not utilize member checking, which could have reduced bias and increased the trustworthiness of the data [75]. This study was designed to build on a previous quantitative evaluation of the EDS programme, but within this study, we could have made use of triangulation to support our interpretation. For example, the patterns of completion could have been supported by the use of analytics data from the virtual learning environment where the EDS programme is hosted. This would have strengthened the findings. Additionally, although the transcripts and analysis were reviewed by a researcher who had not completed the interviews, that researcher was not impartial, having been involved in the development of the programme, and therefore incorporating full peer debriefing would have strengthened the work considerably. Sixthly, this study was conducted within six months of students completing the EDS programme, and some had only completed the training within the month prior to interviews commencing. 
Given that the full benefit of digital skills training may not be recognised until a student experiences the demands of the workforce [76], it is likely that longer-term effects on employability and lifelong learning will not have been uncovered in this research and should be considered in future studies. Finally, due to the nature of the study design, we did not interview students before and after training. Therefore, we cannot draw conclusions about whether the training impacted their definitions of digital competency, but only about their motivations to train, how they approached it, and its impact. Related to this, the study did not reveal specific relationships between themes, and this should be considered in future research.

6. Conclusions and Recommendations

Based on the findings of the current study, students can benefit from online university-wide digital competency training, i.e., training that is not embedded within their core curriculum, to support in-person teaching that is complemented by the use of VLEs and other digital technologies. This may offer a cost-effective approach to supporting students in their teaching and learning, as well as wider endeavours including employability and social connectedness. Such training may be particularly important for specific subsets of students, and as such careful promotion to those students may be helpful. For example, students coming to study from countries where digital platforms are less utilised in education settings may have developed different digital skills and be less competent or confident in working digitally for educational purposes. Similarly, students who have not had the opportunity to complete specific prior qualifications, such as the EPQ or computer science courses, may also particularly benefit. Our findings highlighted that motivations for undertaking this online training vary, but students tended to be motivated by recognition of limited skills or confidence, as well as by the availability of certification and a clear curriculum, suggesting these factors may be important in promoting any digital competency programme. Where training is optional and separated from the main curriculum, it is important to recognise that it is unlikely to be prioritised by students continuously, which may result in a varied learning journey. As such, regular reminders and easy ways to track progress should be made available. Whilst online learning was generally well received, with praise for the diversity of resources and low-stakes assessment, opportunities for in-person interactions may also be beneficial.

Author Contributions

Conceptualization, N.A.S.R. and E.J.D.; methodology, N.A.S.R. and E.J.D.; data collection, E.J.D.; formal analysis, E.J.D.; writing—original draft preparation, E.J.D.; writing—review and editing, N.A.S.R. and E.J.D. All authors have read and agreed to the published version of the manuscript.

Funding

The APC was funded by King’s College London.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Institutional Review Board (or Ethics Committee) of King’s College London (protocol code MRA-20/21-21928 and date approved 22 January 2021) for studies involving humans.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study. Written informed consent has been obtained from the participant(s) to publish this paper.

Data Availability Statement

Anonymised transcripts are available on reasonable request to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Xiao, J. Digital Transformation in Higher Education: Critiquing the Five-year Development Plans (2016–2020) of 75 Chinese Universities. Distance Educ. 2019, 40, 515–533. [Google Scholar] [CrossRef]
  2. Dommett, E.J. Understanding student use of twitter and online forums in higher education. Educ. Inf. Technol. 2019, 24, 325–343. [Google Scholar] [CrossRef]
  3. Dommett, E.J.; Gardner, B.; Van Tilburg, W. Staff and student views of lecture capture: A qualitative study. Int. J. Educ. Technol. High. Educ. 2019, 16, 15. [Google Scholar] [CrossRef]
  4. Guppy, N.; Verpoorten, D.; Boud, D.; Lin, L.; Tai, J.; Bartolic, S. The post-COVID-19 future of digital learning in higher education: Views from educators, students, and other professionals in six countries. Br. J. Educ. Technol. 2022, 53, 1750–1765. [Google Scholar] [CrossRef]
  5. Linder, K.E.; Kelly, K. The Blended Course Design Workbook: A Practical Guide; Taylor & Francis: Abingdon, UK, 2024. [Google Scholar]
  6. Smith, K.; Hill, J. Defining the nature of blended learning through its depiction in current research. High. Educ. Res. Dev. 2019, 38, 383–397. [Google Scholar] [CrossRef]
  7. Bowyer, J.; Chambers, L. Evaluating Blended Learning: Bringing the Elements Together; University of Cambridge: Cambridge, UK, 2017. [Google Scholar]
  8. Ibrahim, R.K.; Aldawsari, A.N. Relationship between digital capabilities and academic performance: The mediating effect of self-efficacy. BMC Nurs. 2023, 22, 434. [Google Scholar] [CrossRef]
  9. Tekleselase, H. Usage of Digital Technology in Higher Education: Teacher and Student Digital Competency. J. Electr. Eng. Electron. Technol. 2021, 10, 1000179. [Google Scholar]
  10. Dinu, L.M.; Byrom, N.C.; Mehta, K.J.; Everett, S.; Foster, J.L.H.; Dommett, E.J. Predicting student mental wellbeing and loneliness and the importance of digital skills. J. Furth. High. Educ. 2022, 46, 1040–1053. [Google Scholar] [CrossRef]
  11. Deloitte. ACS Australia’s Digital Pulse 2019: Booming Today, but How Can We Sustain Digital Workforce Growth? 2019. Available online: https://www.acs.org.au/insightsandpublications/reports-publications/digital-pulse-2019.html (accessed on 20 September 2024).
  12. Sharpe, R. (Ed.) Digital Literacy: From a Definition to a Graduate Attribute to a Measure of Learning Gain 2018: Queen’s Learning and Teaching Conference; Queen’s University Belfast: Belfast, UK, 2018. [Google Scholar]
  13. Morgan, A.; Sibson, R.; Jackson, D. Digital demand and digital deficit: Conceptualising digital literacy and gauging proficiency among higher education students. J. High. Educ. Policy Manag. 2022, 44, 258–275. [Google Scholar] [CrossRef]
  14. European Commission. Directorate-General for Education, Youth, Sport and Culture. In Key Competences for Lifelong Learning; EU Publications: Luxembourg, 2019. [Google Scholar] [CrossRef]
  15. European Council. Digital Skills and Competences and Successful Digital Education and Training: Fit for the Digital Era; European Council: Brussels, Belgium, 2023. [Google Scholar]
  16. List, A. Defining digital literacy development: An examination of pre-service teachers’ beliefs. Comput. Educ. 2019, 138, 146–158. [Google Scholar] [CrossRef]
  17. Mohammadyari, S.; Singh, H. Understanding the effect of e-learning on individual performance: The role of digital literacy. Comput. Educ. 2015, 82, 11–25. [Google Scholar] [CrossRef]
  18. Ng, W. Can we teach digital natives digital literacy? Comput. Educ. 2012, 59, 1065–1078. [Google Scholar] [CrossRef]
  19. Eshet, Y. Digital literacy: A conceptual framework for survival skills in the digital era. J. Educ. Multimed. Hypermedia 2004, 13, 93–106. [Google Scholar]
  20. Pérez, J.; Murray, M.C. Generativity: The new frontier for information and communication technology literacy. Interdiscip. J. Inf. 2010, 5, 127–137. [Google Scholar] [CrossRef]
  21. Wu, D. Exploring digital literacy in the era of digital civilization: A framework for college students in China. Inf. Serv. Use 2024, 44, 69–91. [Google Scholar] [CrossRef]
  22. Janssen, J.; Stoyanov, S.; Ferrari, A.; Punie, Y.; Pannekeet, K.; Sloep, P. Experts’ views on digital competence: Commonalities and differences. Comput. Educ. 2013, 68, 473–481. [Google Scholar] [CrossRef]
  23. Wang, X.; Wang, Z.; Wang, Q.; Chen, W.; Pi, Z. Supporting digitally enhanced learning through measurement in higher education: Development and validation of a university students’ digital competence scale. J. Comput. Assist. Learn. 2021, 37, 1063–1076. [Google Scholar] [CrossRef]
  24. Raji, N.A.; Busson-Crowe, D.A.; Dommett, E.J. University-Wide Digital Skills Training: A Case Study Evaluation. Educ. Sci. 2023, 13, 333. [Google Scholar] [CrossRef]
  25. Prensky, M. Digital natives, digital immigrants Part 1. Horizon 2001, 9, 1–6. [Google Scholar] [CrossRef]
  26. Judd, T. The rise and fall (?) of the digital natives. Australas. J. Educ. Technol. 2018, 34, 99–119. [Google Scholar] [CrossRef]
  27. Bond, M.; Marín, V.I.; Dolch, C.; Bedenlier, S.; Zawacki-Richter, O. Digital transformation in German higher education: Student and teacher perceptions and usage of digital media. Int. J. Educ. Technol. High. Educ. 2018, 15, 48. [Google Scholar] [CrossRef]
  28. McGrew, S.; Breakstone, J.; Ortega, T.; Smith, M.; Wineburg, S. Can students evaluate online sources? Learning from assessments of civic online reasoning. Theory Res. Soc. Educ. 2018, 46, 165–193. [Google Scholar] [CrossRef]
  29. Borrás-Gené, O.; Serrano-Luján, L.; Díez, R.M. Professional and Academic Digital Identity Workshop for Higher Education Students. Information 2022, 13, 490. [Google Scholar] [CrossRef]
  30. Pangrazio, L. Young People’s Literacies in the Digital Age: Continuities, Conflicts and Contradictions; Routledge: London, UK, 2018. [Google Scholar]
  31. Hidalgo, A.; Gabaly, S.; Morales-Alonso, G.; Urueña, A. The digital divide in light of sustainable development: An approach through advanced machine learning techniques. Technol. Forecast. Soc. Chang. 2020, 150, 119754. [Google Scholar] [CrossRef]
  32. Mertala, P.; López-Pernas, S.; Vartiainen, H.; Saqr, M.; Tedre, M. Digital natives in the scientific literature: A topic modeling approach. Comput. Hum. Behav. 2024, 152, 108076. [Google Scholar] [CrossRef]
  33. Martzoukou, K.; Fulton, C.; Kostagiolas, P.; Lavranos, C. A study of higher education students’ self-perceived digital competences for learning and everyday life online participation. J. Doc. 2020, 76, 1413–1458. [Google Scholar] [CrossRef]
  34. Alex-Nmecha, J.C.; Ejitagha, S. An Evaluation of Digital Information Literacy Skills among Undergraduate Students of Library and Information Science in Universities in Nigeria. Mousaion 2023, 41, 1–20. [Google Scholar] [CrossRef]
  35. Onikoyi, B.; Nnamoko, N. University Induction and 1st Year Students’ Integration into Higher Education: An Exploration of Experiences. SSRN Electron. J. 2023. [Google Scholar] [CrossRef]
  36. Wakefield, J.; Grabowski, S. I’ll Be There for You: Generating Sustained Student Connectedness from the Beginning. Stud. Success. [Google Scholar] [CrossRef]
  37. Thorne, S.L. Digital literacies. In Framing Languages and Literacies: Socially Situated Views and Perspectives; Taylor & Francis: Abingdon, UK, 2013; pp. 192–218. [Google Scholar]
  38. Orr, D.; Appleton, M.; Wallin, M. Information literacy and flexible delivery: Creating a conceptual framework and model. J. Acad. Librariansh. 2001, 27, 457–463. [Google Scholar] [CrossRef]
  39. Snavely, L.; Cooper, N. The information literacy debate. J. Acad. Librariansh. 1997, 23, 9–14. [Google Scholar] [CrossRef]
  40. Burnett, S.; Collins, S. Ask the audience! Using a personal response system to enhance information literacy and induction sessions at Kingston University. J. Inf. Lit. 2007, 1, 1–3. [Google Scholar] [CrossRef]
  41. Thompson, K.; Kardos, R.; Knapp, L. From tourist to treasure hunter: A self-guided orientation programme for first-year students. Health Inf. Libr. J. 2008, 25, 69–73. [Google Scholar] [CrossRef] [PubMed]
  42. Verlander, P.; Scutt, C. Teaching information skills to large groups with limited time and resources. J. Inf. Lit. 2009, 3, 31–42. [Google Scholar] [CrossRef]
  43. Wingate, U. Doing away with ‘study skills’. Teach. High. Educ. 2006, 11, 457–469. [Google Scholar] [CrossRef]
  44. Benson, L.; Rodier, K.; Enström, R.; Bocatto, E. Developing a university-wide academic integrity E-learning tutorial: A Canadian case. Int. J. Educ. Integr. 2019, 15, 5. [Google Scholar] [CrossRef]
  45. Kirkpatrick, D. Four-level training evaluation model. US Train. Dev. J. 1959, 13, 34–47. [Google Scholar]
  46. The Open University. Digital and Information Literacy Framework; The Open University: Milton Keynes, UK, 2012. [Google Scholar]
  47. JISC. What Is Digital Capability? JISC: Bristol, UK. Available online: https://digitalcapability.jisc.ac.uk/what-is-digital-capability/ (accessed on 18 November 2024).
  48. Handley, F.J.L. Developing Digital Skills and Literacies in UK Higher Education: Recent Developments and a Case Study of the Digital Literacies Framework at the University of Brighton, UK. Rev. Publicaciones 2018, 48, 109–125. [Google Scholar] [CrossRef]
  49. Cerqueira, F.M. Higher Education Students and Digital Literacies: A Systematic Literature Review. Master’s Thesis, The University of Western Ontario, London, ON, Canada, 2023. [Google Scholar]
  50. Milena, Z.R.; Dainora, G.; Alin, S. Qualitative research methods: A comparison between focus-group and in-depth interview. Ann. Univ. Oradea Econ. Sci. Ser. 2008, 17, 1279–1283. [Google Scholar]
  51. Brown, A.; Danaher, P.A. CHE principles: Facilitating authentic and dialogical semi-structured interviews in educational research. Int. J. Res. Method Educ. 2019, 42, 76–90. [Google Scholar] [CrossRef]
  52. Bailey, J. First steps in qualitative data analysis: Transcribing. Fam. Pract. 2008, 25, 127–131. [Google Scholar] [CrossRef] [PubMed]
  53. Braun, V.; Clarke, V. Reflecting on reflexive thematic analysis. Qual. Res. Sport Exerc. Health 2019, 11, 589–597. [Google Scholar] [CrossRef]
  54. Braun, V.; Clarke, V. Successful Qualitative Research: A Practical Guide for Beginners; Sage Publications Ltd.: London, UK, 2013. [Google Scholar]
  55. Mays, N.; Pope, C. Qualitative research: Rigour and qualitative research. BMJ 1995, 311, 109–112. [Google Scholar] [CrossRef] [PubMed]
  56. Walton, G.; Pickard, A.J.; Dodd, L. Information discernment, mis-information and pro-active scepticism. J. Librariansh. Inf. Sci. 2018, 50, 296–309. [Google Scholar] [CrossRef]
  57. Gill, T. Are Students Who Take the Extended Project Qualification Better Prepared for Higher Education? Research Report; Cambridge University Press & Assessment: Cambridge, UK, 2022. [Google Scholar]
  58. Frand, J. The information-age mindset. EDUCAUSE Rev. 2000, 35, 14–24. [Google Scholar]
  59. Koutropoulos, A. Digital natives: Ten years after. MERLOT J. Online Learn. Teach. 2011, 7, 525–538. [Google Scholar]
  60. Shoufan, A.; Mohamed, F. YouTube and education: A scoping review. IEEE Access 2022, 10, 125576–125599. [Google Scholar] [CrossRef]
  61. Copeman, P.; Keightley, P. Academic Skills Rovers: A just in time peer support initiative for academic skills and literacy development. J. Peer Learn. 2014, 7, 1–22. [Google Scholar]
  62. Gnangnon, S.K. Does Aid for Information and Communications Technology Help Reduce the Global Digital Divide? Policy Internet 2019, 11, 344–369. [Google Scholar] [CrossRef]
  63. Buultjens, M.; Robinson, P. Enhancing aspects of the higher education student experience. J. High. Educ. Policy Manag. 2011, 33, 337–346. [Google Scholar] [CrossRef]
  64. Dommett, E.J.; Gardner, B.; van Tilburg, W. Staff and students perception of lecture capture. Internet High. Educ. 2020, 46, 100732. [Google Scholar] [CrossRef]
  65. Carmichael, M.; Reid, A.; Karpicke, J.D. Assessing the Impact of Educational Video on Student Engagement, Critical Thinking and Learning; A SAGE White Paper; SAGE Publishing: New York, NY, USA, 2018. [Google Scholar]
  66. Meer, N.M.; Chapman, A. Assessment for confidence: Exploring the impact that low-stakes assessment design has on student retention. Int. J. Manag. Educ. 2014, 12, 186–192. [Google Scholar] [CrossRef]
  67. LaTour, K.A.; Noel, H.N. Self-directed learning online: An opportunity to binge. J. Mark. Educ. 2021, 43, 174–188. [Google Scholar] [CrossRef]
  68. Son, L.K.; Simon, D.A. Distributed Learning: Data, Metacognition, and Educational Implications. Educ. Psychol. Rev. 2012, 24, 379–399. [Google Scholar] [CrossRef]
  69. Chen, O.; Castro-Alonso, J.C.; Paas, F.; Sweller, J. Extending Cognitive Load Theory to Incorporate Working Memory Resource Depletion: Evidence from the Spacing Effect. Educ. Psychol. Rev. 2018, 30, 483–501. [Google Scholar] [CrossRef]
  70. Hill, M.; Farrelly, N.; Clarke, C.; Cannon, M. Student mental health and well-being: Overview and Future Directions. Ir. J. Psychol. Med. 2020, 41, 259–266. [Google Scholar] [CrossRef] [PubMed]
  71. Gallagher, D.; Gilmore, A. Social integration and the role of student societies in higher education: An exploratory study in the UK. Int. J. Nonprofit Volunt. Sect. Mark. 2013, 18, 275–286. [Google Scholar] [CrossRef]
  72. Baker, S.E.; Edwards, R. How Many Qualitative Interviews Is Enough? Economic and Social Research Council: Swindon, UK, 2012. [Google Scholar]
  73. Maxwell, J.A. Why qualitative methods are necessary for generalization. Qual. Psychol. 2021, 8, 111. [Google Scholar] [CrossRef]
  74. Roald, T.; Køppe, S.; Bechmann Jensen, T.; Moeskjær Hansen, J.; Levin, K. Why do we always generalize in qualitative research? Qual. Psychol. 2021, 8, 69. [Google Scholar] [CrossRef]
  75. Birt, L.; Scott, S.; Cavers, D.; Campbell, C.; Walter, F. Member Checking: A Tool to Enhance Trustworthiness or Merely a Nod to Validation? Qual. Health Res. 2016, 26, 1802–1811. [Google Scholar] [CrossRef]
  76. van Laar, E.; van Deursen, A.J.A.M.; van Dijk, J.A.G.M.; de Haan, J. The relation between 21st-century skills and digital skills: A systematic literature review. Comput. Hum. Behav. 2017, 72, 577–588. [Google Scholar] [CrossRef]
Figure 1. The overall content and structure of the EDS programme.
Table 1. A summary of the themes and subthemes that emerged from the data.
Theme: Subthemes
Defining digital competency: Digital capabilities are tridimensional; Goal-directed use
Prior learning: Specific qualifications; Self-taught; Peer-to-peer learning
Motives for training: Fear or lack of confidence; Self-identified lack of skill; Comprehensive offering; Certification; Maximising university experience
Learning journey: Online delivery as a double-edged sword; Diversity of resources and assessment; Varying pace of journey
Impact of training: Teaching and learning; Awareness and proficiency; Communication and digital presence; Wider student experience