Abstract
The rise in online learning, accelerated by the COVID-19 pandemic, has led to greater use of synchronous hybrid learning (SHL) in higher education. SHL allows simultaneous teaching of in-person and online learners through videoconferencing tools. Previous studies have identified various benefits (e.g., flexibility) and challenges (e.g., student engagement) to SHL. Whilst systematic reviews have emerged on this topic, few studies have considered the experiences of staff. The aim of this review was threefold: (i) to better understand how staff experiences and perceptions are assessed, (ii) to understand staff experiences in terms of the benefits and challenges of SHL and (iii) to identify recommendations for effective teaching and learning using SHL. In line with the PRISMA guidance, we conducted a systematic review across four databases, identifying 14 studies for inclusion. Studies were conducted in nine different countries and covered a range of academic disciplines. Most studies adopted qualitative methods, with small sample sizes. Measures used were typically novel and unvalidated. Four themes were identified relating to (i) technology, (ii) redesigning teaching and learning, (iii) student engagement and (iv) staff workload. In terms of recommendations, ensuring adequate staff training and ongoing classroom support were considered essential. Additionally, active and collaborative learning were considered important to address issues with interactivity. Whilst these findings largely aligned with previous work, this review also identified limited reporting in research in this area, and future studies are needed to address this.
1. Introduction
Traditionally, teaching and learning within universities have been conducted in person, with only specialist institutions developing and offering distance education. However, over the years, improvements in technology allowing synchronous interactions and advanced learning platforms have meant that the concept of distance learning has largely given way to that of online learning (Siemens et al., 2015), which may or may not be conducted at a distance. Indeed, since the development of the first virtual learning environments (VLEs) in the mid-1990s, use of online learning has steadily increased within conventional universities, with greater reliance on VLEs and tools such as lecture capture to support in-person teaching (Bliuc et al., 2007; Boelens et al., 2015; Dommett, 2019; Dommett et al., 2020). As online learning has grown, universities have been able to combine both in-person and online learning activities. This approach is typically referred to as blended learning (Siemens et al., 2015) or hybrid learning (Raes et al., 2020), although exactly how this is defined has been the subject of much debate (Wang & Huang, 2024). For the purposes of this review, we use blended learning simply to mean “the practices that combine (or blend)” in-person and online learning (Siemens et al., 2015). Blending can happen in different temporal arrangements: in-person and online learning can be blended sequentially, with learners moving from one mode to the next (Staker & Horn, 2012), or in parallel, with in-person and online learning happening at the same time but with learners in distinct locations. This parallel blending has been given various labels over the years, including blended synchronous learning (BSL) (Bower et al., 2015), synchronous hybrid learning (SHL) (Butz & Askim-Lovseth, 2015), hybrid virtual classroom (Raes et al., 2020), here or there (HOT) (Lakhal et al., 2020) and hybrid-flexible or HyFlex (Beatty, 2014). Most recently, the term hybrid appears more commonly in the literature, and we will use synchronous hybrid learning (SHL) in this paper to refer to this approach of parallel blending of in-person and online participants. In some cases, this type of learning can include asynchronous components, but in others, the focus is on synchronous classroom and online student learning. Similarly, in some cases, students have a choice over their mode of attendance, whilst in others, they do not.
Irrespective of the exact approach, SHL is thought to be a promising instructional approach. It can allow students who cannot physically get into the classroom to learn with their peers (Wang et al., 2018). This flexibility is essential given that a proportion of the student body will have caring responsibilities (Runacres et al., 2024) and around half are managing studies alongside part-time employment (Freeman, 2023). Additionally, it can enhance access to education for those in rural or remote areas (Wigal, 2021) and allow for international collaboration, which can enrich learning (Hastie et al., 2010). This approach to learning is thought to have organisational benefits for the institution, including reaching a greater population of students and therefore increasing recruitment potential (Abdelmalak & Parra, 2016) and offering a broader range of optional or elective modules through access to experts and students across geographical boundaries (Bell et al., 2014), which may make such modules more cost-effective. There are also, arguably, pedagogic benefits, including those derived from bringing in subject experts and having student groups mix together for learning, creating a wider community of learners (Raes et al., 2020). Despite the purported benefits of SHL, there are also some challenges that can arise from the technology needed to teach in this way (Weitze et al., 2013) and the required changes to teaching and learning for this format (Bower et al., 2015). Concerns have been raised about low levels of engagement in those participating online compared to those in person (Lakhal et al., 2020; Wang et al., 2017) and limited interaction between those online and in person (Romero-Hall & Vicentini, 2017).
With the rise in use of SHL, there have been several systematic reviews of the topic, including one covering the benefits and challenges mentioned above (Raes et al., 2020). However, this review was pre-COVID-19, with searches ending in May 2019. This is important because a bibliographic analysis has demonstrated that HyFlex, and therefore SHL, has seen a 71% growth rate since 2020 (Mahrishi et al., 2025). This analysis identified a broad range of studies, covering topics ranging from learning frameworks to preparation work. It is noteworthy that, despite the range of topics covered, studies rarely focus on assessing staff experiences. A second review, incorporating articles up to and including those published in 2022, focused on the challenges arising from SHL that directly impact online student engagement and the strategies that can address these difficulties (Wang & Huang, 2024). This review identified four challenges of SHL: (i) inadequate online student-to-teacher interactions, (ii) insufficient online student-to-in-person student interactions, (iii) limited online student–content interactions and, finally, (iv) technological constraints. The same review suggested several strategies for overcoming these, including (i) initiating frequent interactions, (ii) letting online students have active roles, (iii) using continuous assessment, (iv) optimising the technology and, finally, (v) ensuring staff have adequate professional development in the method. However, this review also lacked studies designed to assess staff experience directly. Finally, a scoping review incorporating papers from 2013 to 2022 focused on HyFlex and was conducted to explore the literature for evidence for the use of HyFlex, as well as staff and student experiences of it (Cumming et al., 2024). This review identified 13 studies, only three of which assessed staff experiences (Bower et al., 2015; Butz & Stupnisky, 2016; Hayes & Tucker, 2021). The authors of the review noted that staff had positive beliefs about HyFlex because it allowed students to be connected and engaged with the course content, and because the technology could be implemented easily with breakout tools for online students, but that staff experienced difficulties related to increased workload both in advance of sessions and in class (Cumming et al., 2024). However, it should be noted that in all cases, qualitative data from staff were analysed collectively with those of students, meaning it was not possible to separate out staff insights fully.
Previous research has, therefore, identified some potential benefits of SHL as well as some challenges. However, much of the research previously reviewed and conducted has focused on student experiences, and the staff voice has been comparatively quiet. It is critical to ensure that staff experiences are adequately understood because staff buy-in to an approach is essential for its success, given the important role teaching staff play in educating students (Handal & Herrington, 2003; Harris, 2003). Furthermore, previous research indicates that studies that reassert the staff voice can be particularly valuable in education (Goodson, 1992). This is all the more important where universities are under financial strain and prioritisation must occur (Adams, 2025). Additionally, there has yet to be a review identifying best practices for SHL (Lakhal et al., 2021; Li et al., 2023). With the recent increases in SHL, it is now timely to conduct a new systematic review on the topic, with a focus on staff experiences, and address this important gap in the literature. Specifically, in this review, we aimed to address three research questions: (1) What methods and instruments have been used to assess educators’ experiences and perceptions of SHL at universities? (2) How do educators experience and perceive SHL teaching in universities in terms of benefits and difficulties? (3) Based on staff experiences, what teaching and learning approaches were recommended in the literature for this mode of teaching?
2. Materials and Methods
This review was completed in accordance with PRISMA guidelines for transparent reporting of systematic reviews (Page et al., 2021). The review was preregistered on the Open Science Framework prior to commencement (https://osf.io/hu4ex/, accessed on 23 July 2025).
2.1. Inclusion and Exclusion Criteria
Inclusion criteria were developed considering the population, concept, context and type of evidence. For population, studies were included if the participants were university staff members directly involved in SHL. Studies that focused on alternative populations, such as students or learning designers/developers, were excluded. If a study included multiple groups of participants (e.g., staff and students), the paper could be included only if the data from the participant groups could be differentiated. For the concept, SHL was defined as a format in which a teacher and some students are co-located within a physical classroom and are joined synchronously via the internet by remote students. All other forms of learning were excluded. If multiple forms of learning were considered, papers could be included only if the data on SHL could be differentiated from other forms of learning. For context, studies were included if they focused on teaching and learning within a university as part of a degree at either the undergraduate or postgraduate level. Other types of education (e.g., secondary education) were excluded. For types of evidence, we included any peer-reviewed primary research article published in English. Quantitative, qualitative and mixed-methods research studies were included.
2.2. Search Strategy
Systematic searches were carried out across four databases: Web of Science, PsycINFO, Scopus and ERIC. Databases were selected based on size and to provide a combination of general (e.g., Web of Science) and education-specific (e.g., ERIC) coverage. Web of Science, Scopus and ERIC were searched directly; PsycINFO was searched via the Ovid interface. Literature searches were performed up to 24 February 2024. The search strategy was developed using the Population, Exposure, Outcome or PEO Framework (Hosseini et al., 2024). Population terms (lecturer OR professor OR faculty OR academic OR “university staff”) were combined with Exposure terms (hybrid OR HyFlex OR synchronous) and Outcome terms (experience OR perception OR pedagogy). Categories of terms were combined with AND. Searches were conducted in the titles and abstracts of papers.
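To illustrate how these term groups were combined, the short Python sketch below reconstructs an indicative version of the full search string from the terms listed above. The exact syntax and field codes differed across the four database interfaces, so this shows only the Boolean logic rather than the precise queries submitted.

```python
# Illustrative reconstruction of the Boolean search logic described above.
# Exact field codes and syntax differ across Web of Science, Scopus, ERIC and
# Ovid (PsycINFO), so this sketch shows the logic, not the exact queries run.

population = ['lecturer', 'professor', 'faculty', 'academic', '"university staff"']
exposure = ['hybrid', 'HyFlex', 'synchronous']
outcome = ['experience', 'perception', 'pedagogy']

def or_group(terms):
    """Combine the terms within one category into a parenthesised OR group."""
    return '(' + ' OR '.join(terms) + ')'

# Categories are combined with AND and applied to titles and abstracts.
search_string = ' AND '.join(or_group(group) for group in (population, exposure, outcome))
print(search_string)
# (lecturer OR professor OR faculty OR academic OR "university staff") AND
# (hybrid OR HyFlex OR synchronous) AND (experience OR perception OR pedagogy)
```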
2.3. Screening, Data Extraction and Synthesis
Search results were exported from each database and imported into Covidence, a web-based collaboration software platform that streamlines the production of systematic and other literature reviews (Veritas Health Innovation, n.d.). Within Covidence, de-duplication was first conducted. After de-duplication, title and abstract screening was completed independently by two authors for >10% of the papers, after which the two screeners compared outcomes. The conflict rate was 5.9%, below the 10% threshold, and so the first screener continued with the remaining papers. For full-text screening, this process was repeated, with two authors independently screening >10% of the papers. There was 100% agreement at this stage, with no conflicts, and so the first screener continued with the remainder independently. Once full-text screening was completed, a data extraction table was developed within Covidence and completed for five papers by one author. These were reviewed by a second author, and a discussion was held. This process was completed again after a further three papers were extracted, after which the first screener completed the remainder of the data extraction. This process is summarised in Figure 1. Data were synthesised using a narrative synthesis approach, driven by the research questions. This approach moves beyond pure description to compare studies and explore the robustness of the evidence (Lisy & Porritt, 2016).
Figure 1.
PRISMA diagram providing an overview of the searches and screening conducted in February 2025.
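As a minimal sketch of the dual-screening check described above, and assuming the implied 10% conflict threshold, the snippet below shows how a conflict rate could be computed for the double-screened subset. The counts used are hypothetical placeholders, not the review's actual figures.

```python
# Hypothetical sketch of the dual-screening check: a second reviewer screens
# >10% of records independently, and single-reviewer screening continues only
# if the disagreement (conflict) rate stays below 10%. Counts are placeholders.

def conflict_rate(double_screened: int, conflicts: int) -> float:
    """Proportion of double-screened records on which the two reviewers disagreed."""
    return conflicts / double_screened

rate = conflict_rate(double_screened=100, conflicts=6)  # hypothetical counts
proceed_with_single_screener = rate < 0.10
print(f"Conflict rate: {rate:.1%}; continue with single screener: {proceed_with_single_screener}")
# Conflict rate: 6.0%; continue with single screener: True
```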
2.4. Quality Appraisal
Quality appraisal was completed using the Mixed Methods Appraisal Tool (MMAT), which is designed to appraise qualitative research, randomised controlled trials, non-randomised studies, quantitative descriptive studies and mixed methods studies (Hong et al., 2018). This tool has previously been used in educational research (Jackman et al., 2022). Full details of the quality appraisal for the studies are included in Supplementary Material S1, including the relevant criteria and the appraisal for each study. The studies were rated as good, fair or low quality, with the final quality rating dependent on the percentage of criteria met for the relevant study design (Hong et al., 2018). Those with <40% of the criteria met were rated as low, those with 40–69% were considered fair, and those with 70% or more were rated as good. Mixed methods studies were assessed against the qualitative, quantitative and mixed methods items and then assigned the lowest of these three quality ratings, with the rationale that the overall study quality could not be greater than the ‘weakest’ component. The results of the quality assessment were narratively synthesised to summarise the overall quality of the included studies.
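The rating logic described above can be summarised in the minimal sketch below, which assumes the stated cut-offs (<40% low, 40–69% fair, 70% or more good) and the 'weakest component' rule for mixed methods studies.

```python
# Minimal sketch of the quality-rating bands and the 'weakest component' rule
# described above; it assumes <40% = low, 40-69% = fair and >=70% = good.

RATING_ORDER = {'low': 0, 'fair': 1, 'good': 2}

def rate(percent_criteria_met: float) -> str:
    """Map the percentage of MMAT criteria met to a quality band."""
    if percent_criteria_met < 40:
        return 'low'
    if percent_criteria_met < 70:
        return 'fair'
    return 'good'

def rate_mixed_methods(qualitative: float, quantitative: float, mixed: float) -> str:
    """Overall rating for a mixed methods study is its weakest component rating."""
    components = [rate(qualitative), rate(quantitative), rate(mixed)]
    return min(components, key=RATING_ORDER.get)

print(rate(65))                        # fair
print(rate_mixed_methods(80, 75, 35))  # low: the weakest component dominates
```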
3. Results
3.1. Study Characterisation
Table 1 provides an overview of the 14 studies included in the systematic review. Studies were conducted in nine different countries, with the largest cluster conducted within the USA. As might be expected, the studies all reported collecting data during or after the period in which the COVID-19 pandemic forced changes to in-person teaching. Whilst one study did not report when data were collected (Li et al., 2023), the study itself was published post-COVID-19. The studies used several different terms to identify what we have referred to in this paper as SHL. These included hybrid synchronous live (Abrar et al., 2024), hybrid (Alcaide & Poza, 2023; Li et al., 2023; Petchamé et al., 2023), HyFlex (Boehm & Boerboom, 2023; Bosman et al., 2022; Dawkins, 2022), synchronous hybrid learning (Capello et al., 2024; Gallardo et al., 2023; Tierney et al., 2024; Usher & Hershkovitz, 2024), blended synchronous learning (Lakhal et al., 2021; Thompson & Helal, 2025) and inclusive synchronous learning activities (Melcher et al., 2025). All except two studies (Abrar et al., 2024; Boehm & Boerboom, 2023) reported the disciplines from which participants were drawn, and these spanned a wide range of disciplines, with staff teaching both undergraduate and postgraduate students.
Table 1.
An overview of the studies included in the systematic review in alphabetical order by first author. Design is described in line with the MMAT for quality appraisal.
3.2. RQ1: Methods and Instruments
To better understand what methods and instruments have been used to assess educators’ experiences and perceptions of SHL at universities, we examined the methodologies of the different studies. Most (N = 7, 50%) studies adopted a qualitative approach (Boehm & Boerboom, 2023; Bosman et al., 2022; Capello et al., 2024; Dawkins, 2022; Lakhal et al., 2021; Petchamé et al., 2023; Thompson & Helal, 2025). These typically had very small sample sizes, ranging from 1 to 12 participants (mean N = 7.71). Those collecting qualitative data typically did so via interviews (Boehm & Boerboom, 2023; Lakhal et al., 2021; Thompson & Helal, 2025), but autoethnography was also used (Bosman et al., 2022; Dawkins, 2022), as were surveys (Boehm & Boerboom, 2023; Capello et al., 2024; Petchamé et al., 2023). Only two studies (14%) took an entirely quantitative approach, with sample sizes of 40 and 66 (Abrar et al., 2024; Alcaide & Poza, 2023). The instruments used for the surveys were not previously validated measures for staff. One study adapted parts of the Web-Based Learning Environment Instrument (WEBLEI), designed for students, and then included extra questions from other research (Abrar et al., 2024). The second study used a questionnaire employed in previous research with staff and piloted it on staff in their study before data collection, although the reliability and validity of the scale are unknown (Alcaide & Poza, 2023). The remaining five studies took a mixed-methods approach (Gallardo et al., 2023; Li et al., 2023; Melcher et al., 2025; Tierney et al., 2024; Usher & Hershkovitz, 2024), typically utilising a survey and then focus groups or interviews. As might be expected, these had larger sample sizes for the survey components (N ranging from 73 to 281) than for the interviews or focus groups (N ranging from 9 to 17). Surveys used within these studies were generally developed for the study and therefore not previously validated (Gallardo et al., 2023; Melcher et al., 2025; Tierney et al., 2024; Usher & Hershkovitz, 2024), although one was adapted from a study by the same authors where it was created for assessing student experiences (Li et al., 2023). In summary, much of the work conducted to date has utilised qualitative methods, and where quantitative surveys have been used, they have typically been designed for the specific study or adapted from previous instruments used for assessing student experience. The lack of previously validated instruments means that the results of these studies should be interpreted with caution. However, it is also noteworthy that no validated measure specifically for SHL exists at the time of writing.
3.3. RQ2: Experiences and Perceptions of SHL
Staff experiences of SHL were assessed using a range of research questions, which created a diverse set of findings, detailed by study in Table 1. Across studies, several themes can be identified from the data, relating to (i) technology, (ii) redesigning teaching and learning, (iii) student engagement and (iv) staff workload. Firstly, studies revealed that staff generally felt that the technology available to them was suitable for the purposes of SHL teaching (Abrar et al., 2024; Alcaide & Poza, 2023) and that some enjoyed working with new tools (Boehm & Boerboom, 2023) and the pedagogy these afforded (Bosman et al., 2022; Thompson & Helal, 2025). It was suggested that the technology could be used to track learning and support inclusivity (Lakhal et al., 2021). Despite this, technology could still be a source of frustration (Boehm & Boerboom, 2023; Capello et al., 2024; Melcher et al., 2025; Tierney et al., 2024), which staff felt could compromise their professionalism (Melcher et al., 2025). It was noted that this mode required students to have good internet connectivity (Lakhal et al., 2021; Petchamé et al., 2023; Tierney et al., 2024), although this was generally found to be the case (Li et al., 2023). The presence of technical difficulties was perceived as impacting the student experience (Melcher et al., 2025).
Secondly, studies revealed that staff typically found that teaching using SHL required careful design of classes and a change in pedagogic approach to support both modes of learning (Bosman et al., 2022; Capello et al., 2024; Lakhal et al., 2021; Melcher et al., 2025; Thompson & Helal, 2025). This, in turn, had implications for assessment and therefore could require university-level approvals before the approach could be optimised (Dawkins, 2022; Li et al., 2023). Different types of activity were utilised in SHL, in some cases to bring the two groups of students together (Melcher et al., 2025; Petchamé et al., 2023; Tierney et al., 2024) and in others to allow them to work in ‘within-mode’ groups to respect their preferences for online or in-person learning (Bosman et al., 2022). Despite these changes to learning activities and assessment, SHL was not always deemed successful, with studies revealing that staff felt it was less effective than in-person learning (Abrar et al., 2024; Alcaide & Poza, 2023). However, this experience was not universal; some saw it as an efficient and easy way to teach two groups (Petchamé et al., 2023). Staff perceived the flexibility of SHL as a benefit to students who could not attend or who lived far from campus (Melcher et al., 2025; Petchamé et al., 2023; Thompson & Helal, 2025) and felt that this mode could allow internationalisation of the classroom through global interactions (Melcher et al., 2025; Usher & Hershkovitz, 2024), indicating that even if changes to practice were required and learning was not always as effective, there were still benefits to this mode. For assessment, some studies indicated staff perceived that this mode of teaching could result in unfair assessment of performance (Abrar et al., 2024), whilst others saw assessment and feedback practices as similar across modes (Gallardo et al., 2023). Various reasons were given for assessment difficulties, including difficulty in interaction even though students could ask questions (Abrar et al., 2024). Despite some studies raising concerns about the effectiveness of the learning process, there was no consensus from staff on whether students performed the same as when they had in-person teaching (Alcaide & Poza, 2023).
Thirdly, several studies identified issues around student engagement, both in terms of attendance in the different modes and in terms of in-class interactions. For attendance, staff reported challenges in managing different-sized cohorts online and in person and in managing the unpredictability of the mode of attendance or differential drop-off, especially where asynchronous attendance was also an option (Dawkins, 2022; Thompson & Helal, 2025). Staff reported a lack of satisfaction with interactions (Abrar et al., 2024; Alcaide & Poza, 2023; Li et al., 2023; Tierney et al., 2024), particularly for the students online (Petchamé et al., 2023), with one study indicating staff viewed students more as observers or outsiders in this mode when compared to in-person teaching, where students were seen as an active audience or team players (Usher & Hershkovitz, 2024). However, satisfaction may depend on prior teaching experience: those used to fully online teaching found HyFlex interactions positive, whilst those used to fully in-person teaching found SHL interactions negative (Boehm & Boerboom, 2023). Irrespective of this, the disengagement of online students in SHL was noted as reducing the quality of the student experience (Tierney et al., 2024).
Fourthly, and likely related to the issues around pedagogy and engagement, several studies revealed that staff identified workload as an issue in SHL, noting the extra time required both for preparing classes and within the sessions themselves, where two groups of students had to be monitored (Abrar et al., 2024; Boehm & Boerboom, 2023; Bosman et al., 2022; Capello et al., 2024; Gallardo et al., 2023; Li et al., 2023; Melcher et al., 2025; Tierney et al., 2024). This created a perception of juggling or multi-tasking (Usher & Hershkovitz, 2024). Related to this, staff resourcing was identified as a consideration for SHL. Although staff in one study felt that they had the necessary digital skills to teach in this mode (Abrar et al., 2024), studies revealed that training was needed (Capello et al., 2024; Lakhal et al., 2021; Tierney et al., 2024). This training needed to be interactive and include having IT support staff on hand for the early sessions (Li et al., 2023; Thompson & Helal, 2025; Tierney et al., 2024). Furthermore, it was necessary to have a consistent room setup to avoid difficulties when classes moved rooms (Lakhal et al., 2021; Tierney et al., 2024). Along with IT support resourcing, several studies identified the need for teaching assistants to support HyFlex sessions (Lakhal et al., 2021; Li et al., 2023; Thompson & Helal, 2025; Tierney et al., 2024).
In summary, staff could generally see the benefits to SHL for inclusivity and flexibility. However, teaching using SHL was associated with a higher workload in both planning and delivery as well as requiring training and ongoing support. Interactivity remains a contentious issue for staff, as do technical challenges, both of which can lead to poor student experience.
3.4. RQ3: Best-Supported Teaching and Learning Approaches for SHL
Our final research question sought to identify effective teaching and learning approaches for SHL. Several of the studies identified did not make recommendations or conclusions about this (Alcaide & Poza, 2023; Boehm & Boerboom, 2023; Gallardo et al., 2023; Usher & Hershkovitz, 2024). Details of the recommendations for the remaining studies are shown in Table 2. From Table 2, it is apparent that providing adequate staff training was considered essential (Abrar et al., 2024; Capello et al., 2024; Lakhal et al., 2021; Li et al., 2023; Melcher et al., 2025; Petchamé et al., 2023; Thompson & Helal, 2025). Additionally, ongoing support was recommended, in the form of classroom assistants (Bosman et al., 2022; Capello et al., 2024; Lakhal et al., 2021; Li et al., 2023; Melcher et al., 2025) and on-call tech support (Lakhal et al., 2021). For the former, studies differed on whether this classroom assistance could come from a current student (Capello et al., 2024; Melcher et al., 2025). Although no studies directly compared different pedagogic approaches, most noted the need for an interactive course design to ensure engagement of online students (Abrar et al., 2024; Bosman et al., 2022; Capello et al., 2024; Lakhal et al., 2021; Petchamé et al., 2023; Tierney et al., 2024). A range of techniques were suggested for achieving this, including group work (Abrar et al., 2024), shared documents and screens (Abrar et al., 2024; Bosman et al., 2022), use of breakout rooms (Bosman et al., 2022; Lakhal et al., 2021) and polls or questioning (Abrar et al., 2024; Bosman et al., 2022).
Table 2.
An overview of recommendations for implementing SHL.
3.5. Quality of Studies
Overall, the quality of the studies varied. Full details of the quality appraisal for the studies are included in Supplementary Material S1. Fifty percent (N = 7) of the studies were rated as low quality, although it is noteworthy that three of these were mixed-methods designs where one component was arguably of higher quality than the other, which negatively affected the overall rating. Thirty-six percent (N = 5) were rated as fair, and just 14% (N = 2) were rated as good. Lower quality ratings were often driven by a lack of thorough reporting; across all studies, 15% of the criteria were rated as ‘cannot tell’. The two quantitative studies were rated as low (Abrar et al., 2024) and fair (Alcaide & Poza, 2023). In both cases, it was unclear whether the sample was representative of the population and whether the measures were appropriate, but both studies did include suitable statistics. Purely qualitative studies tended to fare better, with two rated as good (Lakhal et al., 2021; Petchamé et al., 2023) and three as fair (Boehm & Boerboom, 2023; Thompson & Helal, 2025; Usher & Hershkovitz, 2024). For the fair-rated studies, the most common issues arose around whether the interpretation of the data was sufficiently substantiated by the data and whether there was coherence across the process. The remaining qualitative studies were rated as low quality (Bosman et al., 2022; Capello et al., 2024; Dawkins, 2022), but this was largely driven by a lack of reporting. For the mixed-methods studies, three were rated as low quality (Gallardo et al., 2023; Melcher et al., 2025; Tierney et al., 2024), although the qualitative components were typically stronger, with limited reporting of, or justification for, the mixed-methods design. One was rated as fair (Li et al., 2023).
4. Discussion
This review set out to address three questions, the first of which was to identify the methods and instruments used to assess educators’ experiences and perceptions of SHL. Our data indicate that most of the studies took a qualitative approach, using a range of methods from autoethnography to qualitative surveys. Where quantitative scales were used, sample sizes were typically larger, but most studies made use of novel or newly adapted scales that were unvalidated, which could limit their generalisability.
The second question we set out to answer was how educators experience and perceive SHL teaching in universities in terms of benefits and difficulties. The findings here revealed four themes. Firstly, technology was seen as both something that could serve SHL well and a source of frustration. These findings are in keeping with earlier studies (Wang & Huang, 2024; Weitze et al., 2013) and suggest that despite the increase in implementation and research into SHL in recent years (Mahrishi et al., 2025), technological challenges remain ever present. Secondly, staff reported needing to redesign teaching and assessment to ensure both online and in-person students were supported. Despite the changes needed, staff did still recognise benefits to this approach, most notably in allowing flexibility for students and internationalisation. Again, these findings align well with previous studies and reviews on SHL benefits and challenges (Bower et al., 2015; Hastie et al., 2010; Raes et al., 2020; Wang et al., 2018). Coverage of assessment in the context of SHL was more extensive in the studies reviewed here than in previous reviews, possibly because assessment has received more consideration as universities have moved beyond the emergency regulations adopted during COVID-19 towards using SHL as a permanent or more established method. Staff perceptions of how SHL impacts assessment, and whether students perform the same and receive similar feedback, currently lack consensus. Thirdly, concerns were raised regarding the level of interactivity and student engagement, particularly for online students. These concerns have previously been raised in studies, including reviews (Lakhal et al., 2020; Romero-Hall & Vicentini, 2017; Wang & Huang, 2024; Wang et al., 2017), and indeed, student engagement is an ongoing concern with online learning (Siemens et al., 2015). We speculate that difficulties with engagement within SHL are an extension of those in other forms of online learning but may feel greater due to the direct comparison with in-person students learning simultaneously. Certainly, staff experiences of interactions in one study were influenced by their previous modes of teaching (Boehm & Boerboom, 2023). The fourth theme identified related to workload, with staff reporting the higher workload required for planning and during the actual teaching sessions, which required careful monitoring of two distinct groups of students. This has also been noted in a previous review (Cumming et al., 2024). Related to this, the current studies identified the need for sufficient training, also previously noted in reviews (Cumming et al., 2024), but also for classroom support, both for IT and more generally. In summary, the findings of the current review align with those of previous studies, identifying similar benefits of and challenges to SHL. The findings extend previous reviews with greater consideration of assessment, although no consensus has been reached, and with more discussion of workload challenges.
The final question we aimed to answer with this review was what teaching and learning approaches were recommended in the literature for this mode of teaching. Not all the included studies provided specific recommendations for effective teaching and learning approaches. However, those that did suggested methods to address some of the challenges they had also identified. For example, adequate staff training and ongoing classroom support were recommended for implementing SHL. This support included not only IT staff but also those fulfilling more of a classroom or teaching assistant role. The aim of the latter was to reduce cognitive load for staff and to ensure online students are interacted with and kept as engaged as possible, with platforms monitored so that these students are not overlooked. No studies directly compared distinct pedagogic approaches, but all identified a need for interactive and engaging sessions that utilise active and collaborative learning approaches. There were some suggestions for specific ways to achieve this, such as polling, shared documents and pairing of in-person and online students, as well as more use of video and audio compared to chat. These suggestions align with a previous review on the topic (Wang & Huang, 2024), indicating that the incorporation of newer research has not markedly changed the recommended practices.
These findings have implications for institutional practice and policy. Firstly, the findings of this review indicate that institutions should offer comprehensive training in both the pedagogy and the technology of SHL to optimise interactivity. Importantly, this support should be ongoing, e.g., through on-site tech support and a teaching assistant. Secondly, the findings suggest that workload may be considerably higher for SHL compared to either in-person or online learning conducted separately; as such, staff need more time to develop this kind of teaching, and, where SHL is used, this should be reflected in workload planning models.
This review has consolidated the findings of previous studies and reviews. It has demonstrated that despite the rise in the use of, and research about, SHL in recent years, moving beyond the emergency arrangements made during the COVID-19 pandemic, the perceived challenges, benefits and recommendations are similar. However, there are limitations to the review. Firstly, only a relatively small number of studies were included. Secondly, whilst staff data were collected as part of these studies, as required by our inclusion criteria, many studies also included student data, which meant few provided a thorough and in-depth analysis of staff experiences. Thirdly, half of the studies included were rated as low quality, and only a small proportion were good. In most cases, studies simply lacked the detail needed for higher quality ratings. Common problems related to a lack of clarity around the representativeness of the sample and the appropriateness of the measures, as well as a lack of advanced analysis techniques. Furthermore, the small number of studies, combined with heterogeneous methods and a lack of detailed information, has prevented comparisons between contexts, for example, between academic disciplines or countries. Additionally, although nine countries were represented, the sample was dominated by four high-income Western countries (USA, UK, Spain and Australia), meaning there was a lack of diversity. Future research should consider more robust methodologies, with larger sample sizes, which would permit such comparisons, particularly for quantitative work. Furthermore, reporting should follow guidance and checklists for educational interventions (Upsher et al., 2025).
5. Conclusions
In conclusion, SHL may offer both practical and pedagogical advantages, as has been previously reported. Staff experiences indicate that whilst they can perceive benefits to SHL, there can be challenges with technology, pedagogic adaptations, student engagement and workload. The literature indicated that these challenges can be mitigated to some extent with appropriate approaches and ongoing staff support. These issues must be carefully investigated, and different solutions examined and compared. However, it is critical to note the relatively small pool of literature focused on staff; future research must examine staff experience more fully and adopt robust data collection and reporting methods to ensure findings are generalisable beyond specific universities or programmes of study.
Supplementary Materials
The following supporting information can be downloaded at https://www.mdpi.com/article/10.3390/educsci15080987/s1, File S1: Quality Assessment of Included Studies Using the Mixed Methods Appraisal Tool (MMAT).
Author Contributions
Conceptualization, E.J.D. and M.D.; methodology, E.J.D., M.D. and H.C.W.; formal analysis, E.J.D. and H.C.W.; investigation, E.J.D., H.C.W. and M.D.; writing—original draft preparation, E.J.D.; writing—review and editing, E.J.D., H.C.W. and M.D.; funding acquisition, M.D. and E.J.D. All authors have read and agreed to the published version of the manuscript.
Funding
This research was funded by the Circle U. Alliance project, “Hybrid Circle U.: for inclusion, flexibility and internationalization”. The funding was awarded to a collaboration across six Circle U universities.
Institutional Review Board Statement
Not applicable. This study was a review and did not involve any data collection.
Informed Consent Statement
Not applicable.
Data Availability Statement
Not applicable.
Conflicts of Interest
The authors declare no conflicts of interest.
References
- Abdelmalak, M. M. M., & Parra, J. L. (2016). Expanding learning opportunities for graduate students with HyFlex course design. International Journal of Online Pedagogy and Course Design (IJOPCD), 6(4), 19–37. [Google Scholar] [CrossRef]
- Abrar, A.-E., Doha Saleh, A., Dalal, A.-E., & Fatima, A. (2024). An analysis of the academic effectiveness of hybrid learning: The experiences of faculty and students in Kuwait. Journal of Applied Research in Higher Education, 16(2), 328–342. [Google Scholar] [CrossRef]
- Adams, R. (2025, February 1). Quarter of leading UK universities cutting staff due to budget shortfalls. The Guardian. [Google Scholar]
- Alcaide, M. Á., & Poza, E. D. L. (2023). Was the incorporation of Microsoft Teams in higher education an effective tool as a result of the COVID-19 pandemic? Journal of Higher Education Theory and Practice, 23(7), 157–167. [Google Scholar] [CrossRef]
- Beatty, B. (2014). Hybrid courses with flexible participation: The HyFlex course design. In L. Kyei-Blankson, & E. Ntuli (Eds.), Practical applications and experiences in K-20 blended learning environments (pp. 153–177). IGI Global. [Google Scholar] [CrossRef]
- Bell, J., Sawaya, S., & Cain, W. (2014). Synchromodal classes: Designing for shared learning experiences between face-to-face and online students. International Journal of Designs for Learning, 5(1). [Google Scholar] [CrossRef]
- Bliuc, A.-M., Goodyear, P., & Ellis, R. A. (2007). Research focus and methodological choices in studies into students’ experiences of blended learning in higher education. The Internet and Higher Education, 10(4), 231–244. [Google Scholar] [CrossRef]
- Boehm, M., & Boerboom, S. (2023). Faculty experiences of HyFlex: An exploratory study. Educational Research: Theory and Practice, 34(2), 43–47. Available online: https://search.ebscohost.com/login.aspx?direct=true&db=eric&AN=EJ1395229&site=ehost-live (accessed on 18 March 2025).
- Boelens, R., Van Laer, S., De Wever, B., & Elen, J. (2015). Blended learning in adult education: Towards a definition of blended learning. Available online: https://biblio.ugent.be/publication/6905076/file/6905079 (accessed on 18 March 2025).
- Bosman, L. B., Wollega, E., & Naeem, U. (2022). Responsive educational transformations during emergency situations: Collaborative autoethnography applied to the engineering classroom. International Journal of Engineering Education, 38(2), 288–298. Available online: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85161488226&partnerID=40&md5=d7fb5fb3da555827c95241f03d375160 (accessed on 18 March 2025).
- Bower, M., Dalgarno, B., Kennedy, G. E., Lee, M. J. W., & Kenney, J. (2015). Design and implementation factors in blended synchronous learning environments: Outcomes from a cross-case analysis. Computers & Education, 86, 1–17. [Google Scholar] [CrossRef]
- Butz, N. T., & Askim-Lovseth, M. K. (2015). Oral communication skills assessment in a synchronous hybrid MBA programme: Does attending face-to-face matter for US and international students? Assessment & Evaluation in Higher Education, 40(4), 624–639. [Google Scholar] [CrossRef]
- Butz, N. T., & Stupnisky, R. H. (2016). A mixed methods study of graduate students’ self-determined motivation in synchronous hybrid learning environments. The Internet and Higher Education, 28, 85–95. [Google Scholar] [CrossRef]
- Capello, S. A., Gyimah-Concepcion, M., & Buckley-Hughes, B. (2024). Using telepresence robots for doctoral education: Student and faculty experiences. American Journal of Distance Education, 38(4), 374–388. [Google Scholar] [CrossRef]
- Cumming, T. M., Han, C., & Gilanyi, L. (2024). University student and instructor experiences with HyFlex learning: A scoping review. Computers and Education Open, 7, 100229. [Google Scholar] [CrossRef]
- Dawkins, R. (2022). Hybrid-flexible (HyFlex) subject delivery and implications for teaching workload: A ‘small data’ analysis of one academic’s first-hand experience in 2021 and 2022. Australian Universities’ Review, 64(2), 61–69. Available online: https://search.ebscohost.com/login.aspx?direct=true&db=eric&AN=EJ1389651&site=ehost-live (accessed on 20 February 2025).
- Dommett, E. J. (2019). Understanding the use of online tools embedded within a virtual learning environment. International Journal of Virtual and Personal Learning Environments, 9(1), 39–55. [Google Scholar] [CrossRef]
- Dommett, E. J., Gardner, B., & van Tilburg, W. (2020). Staff and students perception of lecture capture. The Internet and Higher Education, 46, 100732. [Google Scholar] [CrossRef]
- Freeman, J. (2023). HEPI research shows nearly half of universities ‘promote’ part-time work. Higher Education Policy Institute (HEPI). [Google Scholar]
- Gallardo, K., Glasserman, L., Rivera, N., & Martínez-Cardiel, L. (2023). Learning assessment challenges from students and faculty perception in times of COVID-19: A case study. Contemporary Educational Technology, 15(2), 415. Available online: https://search.ebscohost.com/login.aspx?direct=true&db=eric&AN=EJ1385477&site=ehost-live (accessed on 18 March 2025). [CrossRef]
- Goodson, I. F. (1992). Studying teachers’ lives. Routledge. [Google Scholar]
- Handal, B., & Herrington, A. (2003). Mathematics teachers’ beliefs and curriculum reform. Mathematics Education Research Journal, 15(1), 59–69. [Google Scholar] [CrossRef]
- Harris, A. (2003). Behind the classroom door: The challenge of organisational and pedagogical change. Journal of Educational Change, 4(4), 369–382. [Google Scholar] [CrossRef]
- Hastie, M., Hung, I.-C., Chen, N.-S., & Kinshuk. (2010). A blended synchronous learning model for educational international collaboration. Innovations in Education and Teaching International, 47(1), 9–24. [Google Scholar] [CrossRef]
- Hayes, S., & Tucker, H. (2021). Using synchronous hybrid pedagogy to nurture a community of inquiry: Insights from a tourism Master’s programme. Journal of Hospitality, Leisure, Sport & Tourism Education, 29, 100339. [Google Scholar] [CrossRef]
- Hong, Q. N., Pluye, P., Fàbregues, S., Bartlett, G., Boardman, F., Cargo, M., Dagenais, P., Gagnon, M.-P., Griffiths, F., & Nicolau, B. (2018). Mixed methods appraisal tool (MMAT), version 2018. Registration of Copyright, 1148552(10), 1–7. [Google Scholar]
- Hosseini, M.-S., Jahanshahlou, F., Akbarzadeh, M. A., Zarei, M., & Vaez-Gharamaleki, Y. (2024). Formulating research questions for evidence-based studies. Journal of Medicine, Surgery, and Public Health, 2, 100046. [Google Scholar] [CrossRef]
- Jackman, P. C., Jacobs, L., Hawkins, R. M., & Sisson, K. (2022). Mental health and psychological wellbeing in the early stages of doctoral study: A systematic review. European Journal of Higher Education, 12(3), 293–313. [Google Scholar] [CrossRef]
- Lakhal, S., Mukamurera, J., Bédard, M.-E., Heilporn, G., & Chauret, M. (2020). Features fostering academic and social integration in blended synchronous courses in graduate programs. International Journal of Educational Technology in Higher Education, 17(1), 5. [Google Scholar] [CrossRef]
- Lakhal, S., Mukamurera, J., Bédard, M.-E., Heilporn, G., & Chauret, M. (2021). Students and instructors perspective on blended synchronous learning in a Canadian graduate program. Journal of Computer Assisted Learning, 37(5), 1383–1396. [Google Scholar] [CrossRef]
- Li, K. C., Wong, B. T. M., Kwan, R., Chan, H. T., Wu, M. M. F., & Cheung, S. K. S. (2023). Evaluation of hybrid learning and teaching practices: The perspective of academics. Sustainability, 15(8), 780. [Google Scholar] [CrossRef]
- Lisy, K., & Porritt, K. (2016). Narrative Synthesis: Considerations and challenges. JBI Evidence Implementation, 14(4), 201. Available online: https://journals.lww.com/ijebh/fulltext/2016/12000/narrative_synthesis__considerations_and_challenges.33.aspx (accessed on 18 March 2025). [CrossRef]
- Mahrishi, M., Abbas, A., Siddiqui, M. K., & Aladhadh, S. (2025). The genesis and prevalence of the HyFlex model: A systematic review and bibliometric analysis. International Journal of Educational Research Open, 8, 100410. [Google Scholar] [CrossRef]
- Melcher, M., Rutherford, J., Secker, J., Wells, R., & Knight, R. A. (2025). Evaluating hybrid teaching practices: A case study of staff and student experiences at City St George’s, University of London. Cogent Education, 12(1), 2448356. [Google Scholar] [CrossRef]
- Page, M. J., McKenzie, J. E., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., Shamseer, L., Tetzlaff, J. M., Akl, E. A., Brennan, S. E., Chou, R., Glanville, J., Grimshaw, J. M., Hróbjartsson, A., Lalu, M. M., Li, T., Loder, E. W., Mayo-Wilson, E., McDonald, S., … Moher, D. (2021). The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ, 372, n71. [Google Scholar] [CrossRef] [PubMed]
- Petchamé, J., Iriondo, I., Korres, O., & Paños-Castro, J. (2023). Digital transformation in higher education: A qualitative evaluative study of a hybrid virtual format using a smart classroom system. Heliyon, 9(6), e16675. [Google Scholar] [CrossRef] [PubMed]
- Raes, A., Detienne, L., Windey, I., & Depaepe, F. (2020). A systematic literature review on synchronous hybrid learning: Gaps identified. Learning Environments Research, 23(3), 269–290. [Google Scholar] [CrossRef]
- Romero-Hall, E., & Vicentini, C. R. (2017). Examining distance learners in hybrid synchronous instruction: Successes and challenges. Online Learning Journal, 21(4), 141–157. [Google Scholar] [CrossRef]
- Runacres, J., Herron, D., Buckless, K., & Worrall, S. (2024). Student carer experiences of higher education and support: A scoping review. International Journal of Inclusive Education, 28(7), 1275–1292. [Google Scholar] [CrossRef]
- Siemens, G., Gašević, D., & Dawson, S. (2015). Preparing for the digital university: A review of the history and current state of distance, blended and online learning. Available online: https://research.monash.edu/files/256525723/256524746_oa.pdf (accessed on 18 March 2025).
- Staker, H., & Horn, M. B. (2012). Classifying K-12 blended learning. Innosight Institute. [Google Scholar]
- Thompson, J., & Helal, J. (2025). Here and elsewhere, together: How emerging blended synchronous learning approaches and perceptions can inform teaching guidance and support. Educational Research and Evaluation. [Google Scholar] [CrossRef]
- Tierney, A., Hopwood, I., & Davies, S. (2024). Staff and student experiences of hybrid teaching in a pandemic-impacted context. Research and Practice in Technology Enhanced Learning, 19, 017. [Google Scholar] [CrossRef]
- Upsher, R., Dommett, E., Carlisle, S., Conner, S., Codina, G., Nobili, A., & Byrom, N. C. (2025). Improving reporting standards in quantitative educational intervention research: Introducing the CLOSER and CIDER checklists. Journal of New Approaches in Educational Research, 14(1), 2. [Google Scholar] [CrossRef]
- Usher, M., & Hershkovitz, A. (2024). From guides to jugglers, from audience to outsiders: A metaphor analysis of synchronous hybrid learning. Learning Environments Research, 27(1), 1–16. [Google Scholar] [CrossRef]
- Veritas Health Innovation. (n.d.). Covidence systematic review software. Available online: www.covidence.org (accessed on 16 January 2025).
- Wang, Q., Huang, C., & Quek, C. L. (2018). Students’ perspectives on the design and implementation of a blended synchronous learning environment. Australasian Journal of Educational Technology, 34(1). [Google Scholar] [CrossRef]
- Wang, Q., & Huang, Q. (2024). Engaging online learners in blended synchronous learning: A systematic literature review. IEEE Transactions on Learning Technologies, 17, 594–607. [Google Scholar] [CrossRef]
- Wang, Q., Quek, C. L., & Hu, X. (2017). Designing and improving a blended synchronous learning environment: An educational design research. The International Review of Research in Open and Distributed Learning, 18(3). [Google Scholar] [CrossRef]
- Weitze, C. L., Ørngreen, R., & Levinsen, K. (2013, October 30–31). The global classroom video conferencing model and first evaluations. Proceedings of the 12th European Conference on E-Learning: SKEMA Business School, Sophia Antipolis, France. [Google Scholar]
- Wigal, C. M. (2021). Teaching the design process in a HyFlex environment. Journal of Higher Education Theory and Practice, 21(10). [Google Scholar] [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).