Article

Assessment and Evaluation in Active Learning Implementations: Introducing the Engineering Education Active Learning Maturity Model

by Humberto Arruda * and Édison Renato Silva
Production Engineering Program, Universidade Federal do Rio de Janeiro, Rio de Janeiro 21941-909, Brazil
* Author to whom correspondence should be addressed.
Educ. Sci. 2021, 11(11), 690; https://doi.org/10.3390/educsci11110690
Submission received: 31 August 2021 / Revised: 22 October 2021 / Accepted: 24 October 2021 / Published: 29 October 2021
(This article belongs to the Special Issue Assessment and Evaluation in Higher Education)

Abstract: The technological changes of recent decades have transformed society as a whole, owing to the speed and availability of information that now exists. As student attention declines, critical thinking and Active Learning, which places the student at the center of the learning process, have gained prominence. Considering the growing popularity of these techniques, this article proposes the Engineering Education Active Learning Maturity Model (E2ALM2), a framework that allows practitioners to assess the current maturity of an Active Learning implementation in a program or a course. E2ALM2 was built from a literature review of key success factors (KSF) for Active Learning implementations, which were grouped into dimensions. Each KSF is composed of constructs, which are detailed with variables. Each variable has a proposed measurement method and an estimated uncertainty level. The framework can support diagnosis and practical improvements in real settings.

1. Introduction

Since the beginning of the second half of the 20th century, the world has undergone technological changes that have transformed several areas of knowledge. Since the appearance of the first computers, data processing capacity and speed have increased exponentially, leading people and society to new behaviors. Education in general has changed, and so has engineering education [1,2].
As a result of the transformations of recent decades, current students were born surrounded by technological resources. With almost all information available on a mobile phone, knowing how to make sense of it becomes increasingly important.
Engineering schools are experiencing a global trend of adapting their programs to the reality of the 21st century. Several movements are attempting to modernize programs and teaching practices, such as the CDIO initiative [3], which “provides students with an education stressing engineering fundamentals set in the context of Conceiving—Designing—Implementing—Operating (CDIO) real-world systems and products” [4]. Additionally, the accreditation criteria for engineering programs in the USA, established by the Accreditation Board for Engineering and Technology (ABET) and known as EC2000 [5], have changed. The novel criteria require US engineering departments to demonstrate that, in addition to having a solid knowledge of science, math, and engineering fundamentals, their graduates have communication skills, multidisciplinary teamwork capabilities, lifelong learning skills, and awareness of the social and ethical considerations associated with the engineering profession [6]. Finally, completely new engineering colleges are being created with proposals entirely different from the traditional 20th-century model, such as Olin College [7] and Aalborg University [8].
A common topic among the engineering modernization movements is the importance of placing the student at the center of the learning process, as highlighted in the learning outcomes of the EC2000 (Criterion 3.i: “a recognition of the need for, and an ability to engage in life-long learning”) [9] and Standard 8 of the CDIO (“Active Learning”) [3] (p. 153). Putting the student at the center of the learning process, along with increasing student engagement, is arguably achieved by the use of Active Learning [10,11,12,13,14,15,16,17,18,19,20,21,22].
Active Learning still lacks a single definitive definition, but three stand out as the most popular. Prince defines it as “any instructional method [used in the classroom] that engages students in the learning process” [23], Roehl as “an umbrella term for pedagogies focusing on student activity and student engagement in the learning process” [24], and Barkley as “an umbrella term that now refers to several models of instruction, including cooperative and collaborative learning, discovery learning, experiential learning, problem-based learning, and inquiry-based learning” [14]. Hartikainen et al. [10] list 66 definitions of Active Learning, grouped into three main categories: (1) defined and viewed as an instructional approach; (2) not defined but viewed as an instructional approach; and (3) not defined but viewed as a learning approach.
Among the main Active Learning techniques, the following stand out: Problem-Based Learning (PBL) [8,23,25,26,27,28,29], Cooperative and Collaborative Learning [13,23,30,31,32,33,34,35], and the Flipped Classroom [20,36,37,38,39].
Furthermore, the pedagogical results and effectiveness of Active Learning are widely documented [19,23,40,41,42,43,44,45]. Hartikainen et al. [10] reported positive effects on the development of subject-related knowledge, professional skills, social skills, communication skills, and meta-competences.
However, there are problems both in research on and in the implementation of Active Learning. Prince [23] points out that comprehensive assessment of Active Learning is difficult due to the limited range of learning outcomes examined and the different possible interpretations of these outcomes. Streveler and Menekse [46] note that “active learning is not a panacea that is a blanket remedy for all instructional inadequacies. Instead, it is a collective term for a group of instructional strategies that produce different results and require differing degrees of time to design, implement, and assess”. Fernandes [47] reported that “students identify the heavy workload which the project entails as one of the main constraints of PBL approach”. There are also the less researched, but often mentioned, barriers of resistance to novelty on the part of lecturers and students [43,48,49,50,51,52].
Although Active Learning has been validated as an effective way to influence student learning and is increasingly being incorporated into the classroom, there is no established way to qualify and evaluate the use of Active Learning techniques by faculty members [40]. There are four maturity models in the field of education, but none specifically allows the assessment of an Active Learning implementation in a course [53,54,55,56]. In addition to the difficulty of measuring Active Learning usage in the classroom, there is no way to assess the maturity level of Active Learning implementations in a course or a program of a Higher Education Institution (HEI), engineering schools included. This gap blurs the diagnosis of the status of a given implementation and consequently leads to less assertive decision making, reducing the effectiveness of changes and of Active Learning as a whole.
Maturity models can bridge this gap. They enable practitioners to assess organizational performance, support management, and enable improvements [57]. Maturity modeling is a generic approach that describes the development of an organization over time through idealized levels towards a final state [58]. In addition, maturity models are instruments to assess organizational elements and select appropriate actions that lead to higher levels of maturity and better performance [59].
Therefore, this work proposes a conceptual maturity model for evaluating Active Learning implementations at the level of a specific course. The model targets the incremental enhancement of courses and is a logical first step towards a more general and comprehensive framework that could extend to the evaluation of institutions as a whole.

2. Methodology

Based on the research objectives, the broad keyword “active learning” was used in Scopus and Web of Science databases to search for abstracts of peer-reviewed journal articles. Additional keywords related to “success factors” and “engineering education” were used to refine the search. Ultimately, 31 studies were selected for review. Figure 1 uses the PRISMA model [60,61] to describe the literature review process.
The initial search returned a total of 13,029 articles. Approximately 25% (3306) of the records were excluded because they belonged to categories other than education. The objective of this criterion was to exclude articles that used “active learning” for purposes other than education.
With the sample reduced to 75% of the original size (9723 articles), filters were applied in the databases to match keywords related to success factors: “critic* factor*”, “key factor*”, and “success factor*”. This step reduced the sample to 127 articles.
The abstracts of these 127 articles were judged against the following inclusion criteria: (1) reported research on key factors and (2) written in English. These criteria were intended to eliminate articles that contained keywords related to success factors but did not actually address them. This step reduced the sample to 42 articles, whose full texts were sought. Of these, 11 full texts were not available for download, resulting in the selection of 31 references for the literature review. After the selection stage, the references were read to identify the Key Success Factors (KSF) for the implementation of Active Learning.
The software MaxQDA® was used to extract and collate text snippets representing key success factors. Similar snippets were then combined into single KSFs to avoid duplication. Next, a definition based on the literature was attributed to each factor. The following step was to define the relevant constructs for each factor and, for each construct, the variables to be used for measurement.
Finally, each variable had a measurement method proposed and an uncertainty degree estimated.
The research method is presented in Figure 2.
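To make the keyword-screening step concrete, the sketch below shows how the wildcard filters could be expressed as regular expressions. This is an illustrative reconstruction only: the actual filtering was performed with the databases’ own query syntax, and the sample abstracts are invented for demonstration.

```python
import re

# The wildcard patterns from the screening step ("*" matches any word ending),
# translated into regular expressions: "critic* factor*", "key factor*",
# and "success factor*".
PATTERNS = [
    re.compile(r"\bcritic\w*\s+factor\w*", re.IGNORECASE),
    re.compile(r"\bkey\s+factor\w*", re.IGNORECASE),
    re.compile(r"\bsuccess\s+factor\w*", re.IGNORECASE),
]

def mentions_success_factors(abstract: str) -> bool:
    """Return True if the abstract matches any success-factor keyword pattern."""
    return any(p.search(abstract) for p in PATTERNS)

# Invented sample abstracts, for demonstration only.
abstracts = [
    "We identify critical factors for Active Learning adoption in engineering.",
    "A survey of flipped classrooms in mechanical engineering programs.",
    "Key factors in PBL implementation are discussed.",
]
selected = [a for a in abstracts if mentions_success_factors(a)]
print(len(selected))  # -> 2
```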

3. Results

The 31 sources included for the literature review provided 14 key success factors, grouped into five dimensions according to their similarity and relatedness to a specific aspect of the educational environment. Table 1 shows the dimensions and their related KSF.
Building on these dimensions, each of the 14 KSF was detailed into 41 constructs, and the constructs were in turn detailed into 90 variables that operationalize objective measurements for assessing the maturity of a given implementation. Then, a measurement method was proposed for each variable, and an uncertainty degree was estimated based on each measurement method. Three measurement methods were proposed:
  • A questionnaire answered by the faculty member in charge of a course (Lecturer Questionnaire, LQ);
  • A questionnaire directed to students (Student Questionnaire, SQ); and
  • An external evaluation by a third party not directly involved in the course (External Evaluation, EE).
As a result, Figure 3 shows the Engineering Education Active Learning Maturity Model (E2ALM2) with its four levels.
All dimensions and their KSF are defined in the following sections. Each KSF is detailed with its constructs and variables, and each variable has a suggested measurement method (MM) and uncertainty degree (UD).
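For readers who want a structural view of the model before the section-by-section walkthrough, the sketch below encodes the hierarchy (dimension → KSF → construct → variable, each variable carrying an MM and a UD) as plain Python data types. The names are taken from Tables 1 and 2; the representation itself is our illustrative assumption, not part of the published model.

```python
from dataclasses import dataclass
from enum import Enum

class MM(Enum):  # measurement method
    LQ = "Lecturer Questionnaire"
    SQ = "Student Questionnaire"
    EE = "External Evaluation"

class UD(Enum):  # uncertainty degree
    LOW = "Low"
    MEDIUM = "Medium"
    HIGH = "High"

@dataclass
class Variable:
    name: str
    methods: tuple[MM, ...]  # e.g., (MM.SQ, MM.LQ) for "SQ and LQ"
    uncertainty: UD

@dataclass
class Construct:
    name: str
    variables: list[Variable]

@dataclass
class KSF:
    name: str
    constructs: list[Construct]

@dataclass
class Dimension:
    name: str
    ksfs: list[KSF]

# One branch of the hierarchy, taken from Tables 1 and 2.
content_quality = Dimension(
    name="Content quality",
    ksfs=[KSF(
        name="Course Artifacts",
        constructs=[Construct(
            name="Use of real-life problems",
            variables=[Variable(
                name="% of course content based on real-life problems",
                methods=(MM.SQ, MM.LQ),
                uncertainty=UD.MEDIUM,
            )],
        )],
    )],
)
```

A complete instance would hold the five dimensions, 14 KSF, 41 constructs, and 90 variables listed in Tables 1–15.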

3.1. Content Quality

This dimension gathers the factors related to the core of the learning process: the quality of the problems, projects, or cases studied (artifacts); the level of difficulty required of the students; whether the activities facilitate learning; and whether the evaluation criteria are clear and consistent. Its three KSF are detailed below.

3.1.1. Course Artifacts

Course artifacts (problems, projects, or cases studied) should:
  • Engage students with real-life problems and active experiences [62];
  • Provide students with a variety of additional instructional resources, such as simulations, case studies, videos, and demonstrations [62];
  • Be suitable to achieve different targets including the support of the students’ learning process and establishing learning outcomes requirements [53];
  • Be clearly written, of appropriate length, useful, and flexible, and provide an appropriate degree of breadth [63];
  • Have suitable intellectual challenge [16,17,18,42,64]; and
  • Begin with an explanation of their purpose [49,65,66].
Table 2 describes the KSF “Course Artifacts” in more detail. Its constructs were derived from the list of requisites presented above. Variables were proposed to measure each construct, along with the most suitable measurement method (MM) and the uncertainty degree (UD) of each measurement.

3.1.2. Student Assessment

Student assessment needs to be clear, concise, and consistent. This involves instructions, assignments, assessments, due dates, course pages, and office hours [62]. Furthermore, criteria for success must be communicated clearly and monitored [18,34,42,67,68,69,70]. Table 3 details the KSF “Student Assessment”.

3.1.3. Learning Facilitation

Learning facilitation includes preparing students to conduct the required activities and tasks, in addition to activities related to the facilitator guiding the students’ learning process [53]. It also involves providing students with regular opportunities for formative feedback from the lecturer [17,18,42,67,70,71]. Table 4 details this KSF.

3.2. Organizational Environment

The factors of this dimension represent abstract aspects of the institution, such as culture, policy, the practice of collecting student feedback, and instructional design.

3.2.1. Culture

Organizational culture is a set of value systems followed by members of an organization as guidelines for behavior and for solving the problems that occur in the organization [72]. Thus, an organization and its members should be aligned in behavior, and the organization should have guidelines for solving problems. Table 5 details this KSF.

3.2.2. Policy

Organizational policy is a set of program plans, activities, and actions that allows predicting how the organization works and how a problem would be solved [72]. Since time is needed to prepare activities, teachers must have it available when implementing something new in their classes [19]. Table 6 details this KSF.

3.2.3. Student Feedback

Organizations are expected to collect feedback from students [25,49,73] and provide the support needed to successfully complete the activity [49].
Thus, it is possible to identify three requirements for organizations to carry out this process successfully: collecting feedback through a suitable process, making use of the feedback collected, and ensuring the quality of the feedback process.
Table 7 details the KSF “Student Feedback”.

3.2.4. Instructional Design

Brophy [74] and Paechter et al. [75] highlight the importance of the structure and coherence of the curriculum and the learning materials. Thus, it is possible to identify two requirements for this KSF:
  • Curriculum should be suitable to the course needs; and
  • Curriculum and learning material should have coherence with each other.
Table 8 details this KSF.

3.3. Organizational Infrastructure

This dimension contains factors that represent the infrastructure available for course activities.

3.3.1. Classrooms

Classrooms designed for an improved Active Learning experience [76] and equipped with technologies can enhance student learning and support teaching innovation [77,78,79,80,81]. Thus, two requirements emerge for this KSF:
  • Organizations should have appropriate classrooms for Active Learning; and
  • Organizations should provide classrooms with technological support.
Table 9 details this KSF.

3.3.2. Technology

The school should provide equipment and a technological structure [19,42]. This involves the availability, reliability, accessibility, and usability of devices, internet (Wi-Fi), learning support, and an inclusive learning environment [16,42,70,82].
Table 10 shows the details of KSF “Technology”.

3.4. Lecturer

The lecturer is the single most important actor in a successful implementation of Active Learning. This dimension groups factors that represent their knowledge, skills, and attitude in carrying out educational innovation.

3.4.1. Knowledge

Knowledge is a combination of framed experience, values, and contextual information that provides an environment for evaluating and incorporating new experiences [72]. DeMonbrun et al. highlighted the relevance of the lecturer’s experience [49]. Therefore, lecturers should have suitable experience as faculty members and information about Active Learning.
Table 11 details the KSF “Knowledge”.

3.4.2. Skills

Skills are the ability to use reason, thoughts, ideas, and creativity in doing, changing, or making things more meaningful, so as to produce value from the results of the work [72]. The lecturer should have skills related to educational innovations in general and to Active Learning specifically. Table 12 shows this KSF in detail.

3.4.3. Attitude

Attitude encompasses a very broad range of activities, including how people walk, talk, act, think, perceive, and feel [72]. Hegarty and Thompson [42] highlight the relevance of lecturer attributes and teaching methods, such as being approachable, supportive, and enthusiastic and delivering content in an interesting way. Table 13 details this KSF.

3.5. Interactions

Placing students at the center of the learning process requires them to step out of the role of recipients of information and become active agents. Interaction among students, and between students and teachers, allows this transition to happen.

3.5.1. Between Students

Opportunities for students to work together and obtain peer feedback should be included in the learning design [42]. Chen, Bastedo, and Howard [62] emphasize that the course should provide online and face-to-face opportunities for students to collaborate with others.
Table 14 shows this KSF in detail.

3.5.2. With Lecturers

Interaction between students and the lecturer supports knowledge construction, motivation, and the establishment of a social relationship [75]. Furthermore, constructive and enriching feedback from the lecturer leads to increased academic success and feelings of support [42]. Table 15 details this KSF.

3.6. Measurement Scales

Most E2ALM2 variables are related to the perceptions of students and teachers. They can be measured on a five-point Likert scale [83], coded as 5: strongly agree; 4: agree; 3: neither agree nor disagree; 2: disagree; and 1: strongly disagree.
The model also involves numerical variables, such as the percentage of activities that clearly define what is expected of the student or the percentage of activities whose purpose is explained to students. For these variables, a five-point scale can also be used, with coding based on frequency or ranges, such as 5: always; 4: often; 3: occasionally; 2: rarely; and 1: never.
Finally, there are binary variables, e.g., whether assessment methods are defined in advance.
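A minimal sketch of how the three variable types could be coded onto a common 1–5 scale follows. The Likert and frequency codings are those given above; the percentage cut-off points and the coding of binary variables to the scale extremes are our assumptions, since the model does not fix them.

```python
# Coding conventions from Section 3.6, expressed as lookup tables.
LIKERT = {"strongly agree": 5, "agree": 4, "neither agree nor disagree": 3,
          "disagree": 2, "strongly disagree": 1}
FREQUENCY = {"always": 5, "often": 4, "occasionally": 3, "rarely": 2, "never": 1}

def code_percentage(pct: float) -> int:
    """Map a numerical variable (0-100%) onto the 1-5 frequency-style scale.
    The range boundaries below are illustrative assumptions."""
    if pct >= 90: return 5  # "always"
    if pct >= 65: return 4  # "often"
    if pct >= 35: return 3  # "occasionally"
    if pct >= 10: return 2  # "rarely"
    return 1                # "never"

def code_binary(value: bool) -> int:
    """Code binary variables (e.g., assessment methods defined in advance)
    to the extremes of the same scale (an assumption)."""
    return 5 if value else 1

print(LIKERT["agree"], code_percentage(72.0), code_binary(True))  # 4 4 5
```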

3.7. KSF Weights

In the proposed model, each dimension has a score independent of the others, so there is no need to define weights for the dimensions. However, it is necessary to define the weight that each KSF has in the composition of the score within its dimension. Two approaches are possible: (i) a uniform distribution inside the dimension and (ii) a distribution according to relative relevance, based on the number of references that support each KSF. Table 16 presents the KSF weights under both criteria.
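Both weighting criteria follow directly from the reference counts in Table 1. The short sketch below computes them; for the “Content quality” dimension it reproduces the Table 16 values of 0.33 (uniform) and 0.42/0.31/0.27 (relative relevance). The two-decimal rounding is our convention.

```python
# Reference counts per KSF within one dimension (from Table 1).
content_quality = {"Artifacts": 11, "Student Assessment": 8, "Learning Facilitation": 7}

def uniform_weights(ksf_refs: dict[str, int]) -> dict[str, float]:
    """Criterion (i): every KSF in the dimension receives the same weight."""
    n = len(ksf_refs)
    return {k: round(1 / n, 2) for k in ksf_refs}

def relevance_weights(ksf_refs: dict[str, int]) -> dict[str, float]:
    """Criterion (ii): weight proportional to the number of supporting references."""
    total = sum(ksf_refs.values())
    return {k: round(v / total, 2) for k, v in ksf_refs.items()}

print(uniform_weights(content_quality))
# {'Artifacts': 0.33, 'Student Assessment': 0.33, 'Learning Facilitation': 0.33}
print(relevance_weights(content_quality))
# {'Artifacts': 0.42, 'Student Assessment': 0.31, 'Learning Facilitation': 0.27}
```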

4. Discussion

As explained in the introduction, there is a lack of instruments that can help engineering schools and lecturers assess Active Learning implementations. The use of maturity models can support them in this task.
According to de Bruin et al. [84], a maturity assessment can be descriptive, prescriptive, or comparative in nature. A purely descriptive model can be applied for an as-is diagnosis, with no provision for improving maturity or relating maturity to performance. A prescriptive model emphasizes the relationships between variables and final performance and indicates how to approach maturity improvement to positively affect the outcome; it therefore allows the development of a roadmap for improvement. A comparative model allows benchmarking across sectors or regions, making it possible to compare similar practices between organizations to assess maturity in different sectors.
The E2ALM2 is a descriptive maturity model (according to de Bruin et al.’s classification), which can be understood as the first step in a life cycle that allows evolution into a prescriptive model. This evolution requires more knowledge about the impact of actions and the identification of replicable actions that support advances in maturity level. This is especially challenging because results in education differ when different contexts and conditions are compared [43].
Although there are four other maturity models in the field of education, they have a different focus from E2ALM2: Project-Based Learning (PBLCMM) [53], Student Engagement (SESR-MM) [54], Curriculum Design (CDMM) [55], and e-Learning [56]. Beyond the difference in focus, none of these models assesses the same requirements or has the scope of E2ALM2. In addition to these four models, there is an extremely simple scale, which is neither a theoretical model with scientific references nor peer-reviewed, but which has the similar objective of assessing the use of Active Learning [85].
The E2ALM2 model allows diagnosing the current stage of an Active Learning implementation, with a focus on a course, based on the objective measurement of 90 variables. For most variables, the suggested measurement method is a questionnaire for the lecturer, for the students, or for both. This choice aims to facilitate the application of the model in real cases, reducing the need for an external evaluator to observe the activities throughout an entire period in order to issue a report.
Obviously, collecting impressions through questionnaires introduces the possibility of bias, on the part of both the teacher and the students. Therefore, response validation techniques will be necessary when creating the questionnaires. Because of this possibility of bias, every variable was assigned an estimated uncertainty degree; where the uncertainty degree is high, the statistical validation of answers will need to be stricter. To mitigate possible contamination of the results by bias, some variables are measured through questions asked of both the lecturer and the students.
The use of Active Learning has several positive effects, as explained in the introduction, but there are also difficulties and limitations. Streveler and Menekse state that Active Learning is not a solution for all instructional inadequacies [46]. The increasing workload for lecturers [52,86] and students [47], resistance to change [43,48,49,50,51], and the need to align curriculum and course activities [86,87] are challenges that need to be overcome in Active Learning implementations.
Furthermore, it is important to emphasize that the E2ALM2 model does not aim to assess the overall quality of an engineering program, but the maturity level of Active Learning implementation, which is a recommendation of the main modernization movements in the Engineering Education field around the world. Courses and schools can still be of a high quality even though they follow a more traditional approach to engineering education. The point here is that whoever wants to modernize their engineering education approach will struggle with the implementation of Active Learning as a pedagogical and cultural element, and the E2ALM2 can shed light for managers and lecturers during the messy times of changes, infrastructural adaptations, and resistance from students and faculty members.
As future work, we recommend: (i) further studies to test the scale of each variable; (ii) empirical testing of the weights of each KSF within their respective dimensions; (iii) testing the questionnaires used to measure all variables; (iv) validation of the framework in different cultural settings, for instance with an international panel of experts; and (v) application of the framework to evaluate the maturity of real cases, allowing qualitative and quantitative analyses.

5. Conclusions

This study proposed a framework to evaluate the maturity of Active Learning adoption in a specific course. The variables described here can serve as a checklist for lecturers adopting Active Learning and as a metric to evaluate the comprehensiveness and quality of existing initiatives.
The proposed model is descriptive, because it allows evaluating the current situation, but it can be understood as a first step towards the construction of a prescriptive model, which can indicate good practices and replicable actions to increase the level of maturity.
E2ALM2 was designed to be easy to apply, centered on questionnaires for lecturers and students, without the need for long periods of external observation, which would increase costs and prevent scalability.
E2ALM2 allows faculty members to assess the current state of Active Learning implementations and therefore compare states before and after planned interventions with specific objectives.
Although the model focuses on a single course, a program or an engineering school can be diagnosed as a composition of the evaluations of the courses that comprise it, which also favors managerial actions.
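As a sketch of this composition idea, assuming a simple mean of course scores per dimension (the paper suggests composition but does not prescribe an aggregation rule; the course names and scores below are invented):

```python
# Hypothetical dimension scores (1-5) for three courses of a program.
courses = {
    "Calculus I": {"Content quality": 3.2, "Interactions": 2.8},
    "Physics I":  {"Content quality": 4.1, "Interactions": 3.5},
    "Design I":   {"Content quality": 4.6, "Interactions": 4.4},
}

def program_diagnosis(courses: dict[str, dict[str, float]]) -> dict[str, float]:
    """Compose program-level scores as the mean of course scores per dimension.
    The unweighted mean is an assumption; a credit-weighted mean is equally plausible."""
    dims = sorted({d for scores in courses.values() for d in scores})
    return {d: round(sum(s[d] for s in courses.values()) / len(courses), 2)
            for d in dims}

print(program_diagnosis(courses))
# {'Content quality': 3.97, 'Interactions': 3.57}
```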

Author Contributions

Writing—original draft, H.A.; Writing—review & editing, É.R.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Beanland, D.; Hadgraft, R. Engineering Education: Transformation and Innovation; RMIT Publishing: Melbourne, Australia, 2013. [Google Scholar]
  2. Graham, R. The Global State of the Art Engineering Education: March 2018; Massachusetts Institute of Technology (MIT): Cambridge, MA, USA, 2018. [Google Scholar]
  3. Crawley, E.F.; Malmqvist, J.; Östlund, S.; Brodeur, D.R.; Edström, K. Rethinking Engineering Education: The CDIO Approach, 2nd ed.; Springer: New York, NY, USA, 2014. [Google Scholar]
  4. CDIO | Worldwide Initiative. Available online: http://cdio.org/about (accessed on 12 October 2021).
  5. Lattuca, L.R.; Terenzini, P.T.; Volkwein, J.F. Engineering Change: A Study of the Impact of EC2000; ABET, Inc.: Baltimore, MD, USA, 2006. [Google Scholar]
  6. Rugarcia, A.; Felder, R.M.; Woods, D.R.; Stice, J.E. The future of engineering education I. A vision of a new century. Chem. Eng. Educ. 2000, 34, 16–25. [Google Scholar]
  7. Goldberg, D.E.; Somerville, M. A Whole New Engineer; ThreeJoy Associates, Inc.: Douglas, MI, USA, 2014. [Google Scholar]
  8. Mohd-yusof, K.; Arsat, D.; Borhan, M.T.B.; de Graaff, E.; Kolmos, A. PBL Across Cultures; Aalborg Universitet: Aalborg, Denmark, 2013. [Google Scholar]
  9. Ramaswamy, S.; Harris, I.; Tschirner, U. Student peer teaching: An innovative approach to instruction in science and engineering education. J. Sci. Educ. Technol. 2001, 10, 165–171. [Google Scholar]
  10. Hartikainen, S.; Rintala, H.; Pylväs, L.; Nokelainen, P. The Concept of Active Learning and the Measurement of Learning Outcomes: A Review of Research in Engineering Higher Education. Educ. Sci. 2019, 9, 276. [Google Scholar] [CrossRef] [Green Version]
  11. Thomas, I. Critical Thinking, Transformative Learning, Sustainable Education, and Problem-Based Learning in Universities. J. Transform. Educ. 2009, 7, 245–264. [Google Scholar] [CrossRef]
  12. Albert, M.; Beatty, B.J. Flipping the Classroom Applications to Curriculum Redesign for an Introduction to Management Course: Impact on Grades. J. Educ. Bus. 2014, 89, 419–424. [Google Scholar] [CrossRef]
  13. Bonwell, C.; Eison, J. Active Learning: Creating Excitement in the Classroom; 1991 ASHE-ERIC Higher Education Reports; ERIC Publications: Washington, DC, USA, 1991. [Google Scholar]
  14. Barkley, E.F. Student Engagement Techniques: A Handbook for College Faculty, 1st ed.; The Jossey-Bass: San Francisco, CA, USA, 2010. [Google Scholar]
  15. Carini, R.M.; Kuh, G.D.; Klein, S.P. Student engagement and student learning: Testing the linkages. Res. High. Educ. 2006, 47, 1–32. [Google Scholar] [CrossRef]
  16. Kuh, G.D. The National Survey of Student Engagement: Conceptual Framework and Overview of Psychometric Properties; Indiana University Center for Postsecondary Research: Bloomington, IN, USA, 2001. [Google Scholar]
  17. Zepke, N.; Leach, L.; Butler, P. Student engagement: What is it and what influences it. Wellingt. Teach. Learn. Res. Initiat. 2010, 1, 1–22. [Google Scholar]
  18. Zepke, N.; Leach, L. Improving student engagement: Ten proposals for action. Act. Learn. High. Educ. 2010, 11, 167–177. [Google Scholar] [CrossRef]
  19. Hernández-de-Menéndez, M.; Guevara, A.V.; Martínez, J.C.T.; Alcántara, D.H.; Morales-Menendez, R. Active learning in engineering education. A review of fundamentals, best practices and experiences. Int. J. Interact. Des. Manuf. 2019, 13, 909–922. [Google Scholar] [CrossRef]
  20. McLaughlin, J.E.; Roth, M.T.; Glatt, G.M.; Gharkholonarehe, N.; Davidson, C.A.; Griffin, L.M.; Esserman, D.A.; Mumper, R.J. The flipped classroom: A course redesign to foster learning and engagement in a health professions school. Acad. Med. 2014, 89, 236–243. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  21. Slavich, G.M.; Zimbardo, P.G. Transformational Teaching: Theoretical Underpinnings, Basic Principles, and Core Methods. Educ. Psychol. Rev. 2012, 24, 569–608. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  22. Fernandes, S.; Mesquita, D.; Flores, M.A.; Lima, R.M. Engaging students in learning: Findings from a study of project-led education. Eur. J. Eng. Educ. 2013, 39, 55–67. [Google Scholar] [CrossRef]
  23. Prince, M. Does Active Learning Work? A Review of the Research. J. Eng. Educ. 2004, 93, 223–231. [Google Scholar] [CrossRef]
  24. Roehl, A.; Reddy, S.L.; Shannon, G.J. The Flipped Classroom: An Opportunity To Engage Millennial Students Through Active Learning Strategies. J. Fam. Consum. Sci. 2013, 105, 44–49. [Google Scholar] [CrossRef]
  25. Yadav, A.; Subedi, D.; Lundeberg, M.A.; Bunting, C.F. Problem-based learning: Influence on students’ learning in an electrical engineering course. J. Eng. Educ. 2011, 100, 253–280. [Google Scholar] [CrossRef]
  26. Edström, K.; Kolmos, A. PBL and CDIO: Complementary models for engineering education development. Eur. J. Eng. Educ. 2014, 39, 539–555. [Google Scholar] [CrossRef]
  27. Felder, R.M.; Silverman, L.K. Learning and Teaching Styles in Engineering Education. Eng. Educ. 1988, 78, 674–681. [Google Scholar]
  28. Hoidn, S.; Kärkkäinen, K. Promoting Skills for Innovation in Higher Education. A Literature-Review on the Effectiveness of Problem-based Learning and of Teaching Behaviours; OECD Education Working Papers, No. 100; OECD Publishing: Paris, France, 2014. [Google Scholar]
  29. Chong, J.L.; Benza, R. Teaching Innovation Skills. Bus. Educ. Innov. J. 2015, 7, 43–50. [Google Scholar]
  30. Rodriguez-Triana, M.J.; Prieto, L.P.; Holzer, A.; Gillet, D. Instruction, Student Engagement, and Learning Outcomes: A Case Study Using Anonymous Social Media in a Face-to-Face Classroom. IEEE Trans. Learn. Technol. 2020, 13, 718–733. [Google Scholar] [CrossRef]
  31. Holbert, K.; Karady, G.G. Strategies, Challenges and Prospects for Active Learning in the Computer-Based Classroom. IEEE Trans. Educ. 2008, 52, 31–38. [Google Scholar] [CrossRef]
  32. Carr, R.; Palmer, S.; Hagel, P. Active learning: The importance of developing a comprehensive measure. Act. Learn. High. Educ. 2015, 16, 173–186. [Google Scholar] [CrossRef] [Green Version]
  33. Salaber, J. Facilitating student engagement and collaboration in a large postgraduate course using wiki-based activities. Int. J. Manag. Educ. 2014, 12, 115–126. [Google Scholar] [CrossRef] [Green Version]
  34. Chapman, E. Alternative approaches to assessing student engagement rates. Pract. Assess. Res. Eval. 2003, 8, 2002–2003. [Google Scholar]
  35. Bolton, K.; Saalman, E.; Christie, M.; Ingerman, Å.; Linder, C. SimChemistry as an active learning tool in chemical education. Chem. Educ. Res. Pract. 2008, 9, 277–284. [Google Scholar] [CrossRef]
  36. Burke, A.S.; Fedorek, B. Does ‘flipping’ promote engagement?: A comparison of a traditional, online, and flipped class. Act. Learn. High. Educ. 2017, 18, 11–24. [Google Scholar] [CrossRef]
  37. Van Alten, D.C.; Phielix, C.; Janssen, J.; Kester, L. Effects of flipping the classroom on learning outcomes and satisfaction: A meta-analysis. Educ. Res. Rev. 2019, 28, 100281. [Google Scholar] [CrossRef]
  38. Mori, T. The Flipped Classroom: An Instructional Framework for Promotion of Active Learning. In Deep Active Learning; Matsushita, K., Ed.; Springer: Singapore, 2017; pp. 95–109. [Google Scholar] [CrossRef]
  39. Howell, R.A. Engaging students in Education for Sustainable Development: The benefits of active learning, reflective practices and flipped classroom pedagogies. J. Clean. Prod. 2021, 325, 129318. [Google Scholar] [CrossRef]
  40. Van Amburgh, J.A.; Devlin, J.W.; Kirwin, J.L.; Qualters, D.M. A Tool for Measuring Active Learning in the Classroom. Am. J. Pharm. Educ. 2007, 71, 85. [Google Scholar] [CrossRef] [Green Version]
  41. Freeman, S.; Eddy, S.L.; McDonough, M.; Smith, M.K.; Okoroafor, N.; Jordt, H.; Wenderoth, M.P. Active learning increases student performance in science, engineering, and mathematics. Proc. Natl. Acad. Sci. USA 2014, 111, 8410–8415. [Google Scholar] [CrossRef] [Green Version]
  42. Hegarty, B.; Thompson, M. A teacher’s influence on student engagement: Using smartphones for creating vocational as-sessment ePortfolios. J. Inf. Technol. Educ. Res. 2019, 18, 113–159. [Google Scholar]
  43. Ito, H.; Kawazoe, N. Active Learning for Creating Innovators: Employability Skills beyond Industrial Needs. Int. J. High. Educ. 2015, 4, 81. [Google Scholar] [CrossRef] [Green Version]
  44. Lizzio, A.; Wilson, K. Action learning in higher education: An investigation of its potential to develop professional capability. Stud. High. Educ. 2004, 29, 469–488. [Google Scholar] [CrossRef]
  45. Baepler, P.; Walker, J.D.; Driessen, M. It’s not about seat time: Blending, flipping, and efficiency in active learning classrooms. Comput. Educ. 2014, 78, 227–236. [Google Scholar] [CrossRef]
  46. Streveler, R.A.; Menekse, M. Taking a Closer Look at Active Learning. J. Eng. Educ. 2017, 106, 186–190. [Google Scholar] [CrossRef]
  47. Fernandes, S.R.G. Preparing Graduates for Professional Practice: Findings from a Case Study of Project-based Learning (PBL). Proc.—Soc. Behav. Sci. 2014, 139, 219–226. [Google Scholar] [CrossRef] [Green Version]
  48. Borrego, M.; Nguyen, K.A.; Crockett, C.; DeMonbrun, M.; Shekhar, P.; Tharayil, S.; Finelli, C.J.; Rosenberg, R.S.; Waters, C. Systematic Literature Review of Students’ Affective Responses to Active Learning: Overview of Results. In Proceedings of the IEEE Frontiers in Education Conference (FIE), San Jose, CA, USA, 3–6 October 2018; pp. 1–7. [Google Scholar] [CrossRef]
  49. DeMonbrun, M.; Finelli, C.J.; Prince, M.; Borrego, M.; Shekhar, P.; Henderson, C.; Waters, C. Creating an Instrument to Measure Student Response to Instructional Practices. J. Eng. Educ. 2017, 106, 273–298. [Google Scholar] [CrossRef] [Green Version]
  50. Bicknell-Holmes, T.; Hoffman, P.S. Elicit, engage, experience, explore: Discovery learning in library instruction. Ref. Serv. Rev. 2000, 28, 313–322. [Google Scholar] [CrossRef] [Green Version]
  51. Andrews, M.; Prince, M.; Finelli, C.; Graham, M.; Borrego, M.; Husman, J. Explanation and Facilitation Strategies Reduce Student Resistance to Active Learning. Coll. Teach. 2021, 1–11. [Google Scholar] [CrossRef]
  52. Alves, A.C.; Sousa, R.; Moreira, F.; Carvalho, M.A.; Cardoso, E.; Pimenta, P.; Malheiro, T.; Brito, I.; Fernandes, S.R.G.; Mesquita, D. Managing PBL difficulties in an industrial engineering and management program. J. Ind. Eng. Manag. 2016, 9, 586–611. [Google Scholar] [CrossRef] [Green Version]
  53. Al Mughrabi, A.; Jaeger, M. Using a Capability Maturity Model in Project Based Learning. Eur. J. Eng. Educ. 2016, 94–107. [Google Scholar]
  54. Nelson, K.J.; Clarke, J.A.; Stoodley, I.; Creagh, T. Using a Capability Maturity Model to build on the generational approach to student engagement practices. High. Educ. Res. Dev. 2014, 34, 351–367. [Google Scholar] [CrossRef] [Green Version]
  55. Thong, C.L.; Yusmadi, Y.J.; Rusli, A.; Hayati, A.N. Applying capability maturity model to curriculum design: A case study at private institution of higher learning in Malaysia. Lect. Notes Eng. Comput. Sci. 2012, 2198, 1070–1075. [Google Scholar]
  56. Marshall, S. New Zealand Tertiary Institution e-Learning Capability: Informing and Guiding e-Learning Architectural Change and Development; New Zealand Ministry of Education: Wellington, New Zealand, 2006; Volume 19, 118p. [Google Scholar]
  57. Maier, A.; Moultrie, J.; Clarkson, J. Assessing organizational capabilities: Reviewing and guiding the development of maturity grids. IEEE Trans. Eng. Manag. 2020, 59, 138–159. [Google Scholar] [CrossRef]
  58. Klimko, G. Knowledge management and maturity models: Building common understanding. In Proceedings of the 2nd European Conference on Knowledge Management, Bled, Slovenia, 8–9 November 2001; Volume 2, pp. 269–278. [Google Scholar]
  59. Kohlegger, M.; Maier, R.; Thalmann, S. Understanding maturity models results of a structured content analysis. In Proceedings of the 9th International Conference on Knowledge Management and Knowledge Technologies (I-KNOW ’09 and I-SEMANTICS ’09), Graz, Austria, 2–4 September 2009; pp. 51–61. [Google Scholar]
  60. Shamseer, L.; Moher, D.; Clarke, M.; Ghersi, D.; Liberati, A.; Petticrew, M.; Shekelle, P.; Stewart, L.A.; The PRISMA-P Group. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015: Elaboration and explanation. BMJ 2015, 349, 1–25. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  61. Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. Int. J. Surg. 2010, 8, 336–341. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  62. Chen, B.; Bastedo, K.; Howard, W. Exploring design elements for online STEM courses: Active learning, engagement & assessment design. Online Learn. J. 2018, 22, 59–76. [Google Scholar]
  63. Shee, D.Y.; Wang, Y.-S. Multi-criteria evaluation of the web-based e-learning system: A methodology based on learner satisfaction and its applications. Comput. Educ. 2008, 50, 894–905. [Google Scholar] [CrossRef]
  64. Evans, C.; Mujis, D.; Tomlinson, D. Engaged Student Learning: High-Impact Strategies to Enhance Student Achievement. 2015. Available online: https://www.advance-he.ac.uk/knowledge-hub/engaged-student-learning-high-impact-strategies-enhance-student-achievement (accessed on 24 October 2021).
  65. Bacon, D.R.; Stewart, K.A. Lessons From the Best and Worst Team Experiences: How a Teacher Can Make the Difference: Reflections and Recommendations for Student Teams Researchers. J. Manag. Educ. 2019, 43, 543–549. [Google Scholar] [CrossRef]
  66. Donohue, S.K.; Richards, L.G. Factors affecting student attitudes toward active learning activities in a graduate engineering statistics course. In Proceedings of the 39th IEEE International Conference on Frontiers in Education Conference, San Antonio, TX, USA, 18 October 2009; pp. 1–6. [Google Scholar]
  67. Cochrane, T.; Antonczak, L. Connecting the theory and practice of mobile learning: A framework for creative pedagogies using mobile social media. Media Educ. 2015, 6, 248–269. [Google Scholar]
  68. Hattie, J.A.C.; Donoghue, G. Learning strategies: A synthesis and conceptual model. NPJ Sci. Learn. 2016, 1, 16013. [Google Scholar] [CrossRef]
  69. Rutherford, C. Using Online Social Media to Support Preservice Student Engagement. MERLOT J. Online Learn. Teach. 2010, 6, 703–711. [Google Scholar]
  70. Tarantino, K.; Mcdonough, J.; Hua, M. Effects of Student Engagement with Social Media on Student Learning: A Review of Literature. J. Technol. Stud. Aff. 2013, 1, 1–13. [Google Scholar]
  71. Mesquita, I.; Coutinho, P.; De Martin-Silva, L.; Parente, B.; Faria, M.; Afonso, J. The Value of Indirect Teaching Strategies in Enhancing Student-Coaches’ Learning Engagement. J. Sports Sci. Med. 2015, 14, 657–668. [Google Scholar]
  72. Priatna, T.; Maylawati, D.S.; Sugilar, H.; Ramdhani, M.A. Key Success Factors of e-Learning Implementation in Higher Education. Int. J. Emerg. Technol. Learn. (iJET) 2020, 15, 101–114. [Google Scholar] [CrossRef]
  73. Francoise, B.; Semsar, K.; Kennedy, S. How Not to Lose Your Students with Concept Maps. J. Coll. Sci. Teach. 2011, 41, 61–68. [Google Scholar]
  74. Brophy, J.E. Teaching; International Academy of Education: Geneva, Switzerland; International Bureau of Education: Geneva, Switzerland, 1999. [Google Scholar]
  75. Paechter, M.; Maier, B.; Macher, D. Students’ expectations of, and experiences in e-learning: Their relation to learning achievements and course satisfaction. Comput. Educ. 2010, 54, 222–229. [Google Scholar] [CrossRef]
  76. Park, E.L.; Choi, B.K. Transformation of classroom spaces: Traditional versus active learning classroom in colleges. High. Educ. 2014, 68, 749–771. [Google Scholar] [CrossRef]
  77. Charles, E.S.; Whittaker, C.; Dugdale, M.; Guillemette, J. College level active learning classrooms: Challenges of using the heterogeneous ecology. In Proceedings of the Orchestrated Collaborative Classroom Workshop, Gothenburg, Sweden, 7 June 2015; Volume 1411, pp. 39–44. [Google Scholar]
  78. Chiu, P.H.P.; Cheng, S.H. Effects of active learning classrooms on student learning: A two-year empirical investigation on student perceptions and academic performance. High. Educ. Res. Dev. 2016, 36, 269–279. [Google Scholar] [CrossRef]
  79. Chiu, P.H.P.; Lai, K.W.C.; Fan, T.K.F.; Cheng, S.H. A pedagogical model for introducing 3D printing technology in a freshman level course based on a classic instructional design theory. In Proceedings of the IEEE Frontiers in Education Conference (FIE), El Paso, TX, USA, 21–24 October 2015; pp. 1–6. [Google Scholar] [CrossRef]
  80. Dori, Y.J.; Belcher, J. How does technology-enabled active learning affect undergraduate students’ understanding of electromagnetism concepts? J. Learn. Sci. 2005, 14, 243–279. [Google Scholar] [CrossRef]
  81. Soderdahl, P. Library classroom renovated as an active learning classroom. Libr. Hi Tech 2011, 29, 83–90. [Google Scholar] [CrossRef] [Green Version]
  82. AUSSE. Australasian Survey of Student Engagement; Australasian Survey of Student Engagement: Camberwell, VIC, Australia, 2010. [Google Scholar]
  83. Likert, R. A technique for the measurement of attitudes. Arch. Psychol. 1932, 22, 55. [Google Scholar]
  84. De Bruin, T.; Rosemann, M.; Freeze, R.; Kulkarni, U. Understanding the main phases of developing a maturity assessment model. In Proceedings of the 16th Australasian Conference on Information Systems, Sydney, Australia, 29 November–2 December 2005. [Google Scholar]
  85. Shaping EDU (Arizona State University). Active Learning in Digital Realms: Capability Maturity Model. 2019. Available online: https://shapingedu.asu.edu/active-learning-digital-realms (accessed on 18 October 2021).
  86. Lima, R.M.; et al. A project management framework for planning and executing interdisciplinary learning projects in engineering education. In Project Approaches to Learning in Engineering Education; Brill Sense: Leiden, The Netherlands, 2012; pp. 53–76. [Google Scholar]
  87. Fernandes, S.; Abelha, M.; Albuquerque, A.S.; Sousa, E. Curricular and pedagogic innovation in a social education programme: Findings from the implementation of PBL. Int. Symp. Proj. Approaches Eng. Educ. 2011, 10, 375–384. [Google Scholar]
Figure 1. Source selection process (N = 31).
Figure 2. Research procedures.
Figure 3. E2ALM2.
Table 1. Dimensions, KSF, and references.
Dimension | KSF | References
Content quality | Course artifacts | [16,17,18,42,49,53,62,63,64,65,66]
 | Student assessment | [18,34,42,62,67,68,69,70]
 | Learning facilitation | [17,18,42,53,67,70,71]
Organizational environment | Culture | [72]
 | Policy | [19,72]
 | Student feedback | [25,49,73]
 | Instructional design | [74,75]
Organizational infrastructure | Classrooms | [76,77,78,79,80,81]
 | Technology | [16,19,42,70,82]
Lecturer | Knowledge | [49,72]
 | Skills | [72]
 | Attitude | [42,72]
Interactions | Between students | [42,62]
 | With lecturers | [42,75]
Table 2. KSF “Course Artifacts”.
Construct | Variable | MM | UD
Use of real-life problems | % of course content based on real-life problems | SQ and LQ | Medium
Application of active experiments | % of classes using active methods | SQ and LQ | Medium
 | Students’ perception of hands-on activities | SQ | Low
Variety of instructional resources | Quantity of instructional resources used | SQ and LQ | Low
 | % of classes using resources other than the board or projector | SQ and LQ | Low
 | Students’ perception of the use of various resources | SQ | Low
Adequacy to learning outcomes (LO) | % of classes linked directly to an LO | SQ and LQ | Medium
 | Students’ perception of reaching an LO | SQ | Medium
Suitability of intellectual challenge | Students’ perception of the level of difficulty presented | SQ | Low
Clarity in writing of course activities | Students’ perception of the clarity used | SQ | Low
Size of course activities | Students’ perception of size | SQ | Low
Explanation of purpose of course activities | Students’ perception of clarity in the purpose of the activities | SQ | Low
 | % of activities in which the purpose is explained to students | SQ | Low
Table 3. KSF “Student Assessment”.
Construct | Variable | MM | UD
Clearness of assessment methods | Perception of students on the clarity of assessment methods | SQ | Low
 | Are the assessment methods defined in advance? | SQ and LQ | Low
 | % of activities that have defined what is expected of the student | SQ and LQ | Low
Clearness of criteria for success | Perception of students on the clarity of success criteria | SQ | Low
 | Are the success criteria defined in advance? | SQ and LQ | Low
Communications with students | Is information about assessment methods and success criteria made available before (or at the beginning of) the course? | SQ and LQ | Low
 | Students’ perception of communication of assessment methods and success criteria | SQ | Low
Table 4. KSF “Learning Facilitation”.
Construct | Variable | MM | UD
Preparation of students to conduct activities required | % of activities flagged as supporting another activity | SQ and LQ | Medium
 | Students’ perception of the existing preparation for conducting activities | SQ | Medium
 | Students’ perception of the teacher’s performance as a facilitator | SQ | High
 | Intensity of the participation of monitors or auxiliary teachers during the course | SQ and LQ | Medium
Formative feedback from teacher | % of activities where there is formative feedback from the teacher | SQ | Medium
 | Students’ perception of the intensity of support received via formative feedback | SQ | Medium
Table 5. KSF “Culture”.
Construct | Variable | MM | UD
Acceptance of changes by the organization | Ease of approval of pedagogical changes | LQ | Medium
 | Ease of approval of administrative changes | LQ | Medium
Behavior alignment | Clarity of expected behaviors | LQ | Medium
 | Existence (or maturity) of behavioral guidelines | LQ | Low
Ability to solve problems | Perception of the speed with which problems are solved | LQ | Low
 | Perception of transparency in problem solving | LQ | Low
Defining rules | Existence (or maturity) of a code of ethics | LQ | Low
Adequacy to the rules | Perception of the existence of punishments for those who violate certain rules | LQ | Medium
Table 6. KSF “Policy”.
Construct | Variable | MM | UD
Organizational support for the preparation of activities | Perception of the existence of time available for planning new activities | LQ | Low
 | Average % of teachers’ time spent on classroom activities | LQ | Low
 | Average % of teachers’ time spent on administrative activities | LQ | Low
 | Average number of administrative functions performed by teachers | LQ | Low
 | Perception of the availability of auxiliary resources for the preparation of activities | LQ | Low
Adequacy of pedagogical plans | Perception of the adequacy of existing teaching plans to the use of AL | LQ | Medium
Table 7. KSF “Student Feedback”.
Construct | Variable | MM | UD
Collecting student feedback | Existence (or maturity) of the process of receiving feedback from students | SQ | Low
Using student feedback | Students’ perception that the points raised in their feedback are addressed | SQ | Medium
 | Number of objective actions resulting from student feedback in recent years | EE | Low
Quality of the student feedback | Is feedback anonymous? | SQ | Low
 | Is the collection in person or remote? | SQ | Low
 | Students’ perception of the ease of the process of giving feedback | SQ | Low
Table 8. KSF “Instructional Design”.
Construct | Variable | MM | UD
Structure of the curriculum | Perception of the adequacy of the curriculum to the needs of the course | SQ | High
Coherence of the curriculum and the learning material | Student perception of the alignment of the curriculum with the course material | SQ | Medium
Table 9. KSF “Classrooms”.
Construct | Variable | MM | UD
Classrooms designed for an improved Active Learning experience | Existence of classrooms for Active Learning | SQ and LQ | Low
 | Classroom availability for Active Learning | SQ and LQ | Low
 | % of activities performed in an environment suitable for Active Learning | SQ and LQ | Medium
Classrooms equipped with technologies to enhance student learning and support teaching innovation | Existence of classrooms equipped with multimedia devices and/or laboratories | SQ and LQ | Low
 | Availability of classrooms equipped with multimedia devices and/or laboratories | SQ and LQ | Low
 | % of activities performed in a technologically appropriate environment | SQ and LQ | Low
Table 10. KSF “Technology”.
Construct | Variable | MM | UD
Availability of technology | Availability of multimedia devices | SQ and LQ | Low
 | Internet availability on campus | SQ and LQ | Low
 | Availability of e-learning system | SQ and LQ | Low
Reliability of technology | Reliability of multimedia devices | SQ and LQ | Medium
 | On-campus internet reliability | SQ and LQ | Low
 | Reliability of e-learning system | SQ and LQ | Medium
Accessibility of technology | Accessibility of multimedia devices | SQ and LQ | Medium
 | On-campus internet accessibility | SQ and LQ | Low
 | Accessibility of e-learning system | SQ and LQ | Low
Usability of technology | Usability of multimedia devices | SQ and LQ | Medium
 | Campus internet usability | SQ and LQ | Low
 | Usability of e-learning system | SQ and LQ | Medium
Table 11. KSF “Knowledge”.
Construct | Variable | MM | UD
Experience | Activity time as a lecturer | LQ | Low
 | Highest academic title | LQ | Low
 | Time since the highest title was obtained | LQ | Low
Contextual information | Level of knowledge about Active Learning | LQ | High
Table 12. KSF “Skills”.
Construct | Variable | MM | UD
Skills about Active Learning | Number of Active Learning events attended | LQ | Low
 | Number of books read on Active Learning | LQ | Low
 | Number of Active Learning techniques mastered | LQ | Low
Skills about educational innovations | Number of events on educational innovations attended | LQ | Low
 | Number of books read on educational innovations | LQ | Low
Table 13. KSF “Attitude”.
Construct | Variable | MM | UD
Willingness to adopt Active Learning techniques | Qualitative perception of disposition | EE | High
 | Number of periods in which adoption was attempted | LQ | Low
 | Number of subjects in which adoption was attempted | LQ | Low
 | Time since last adoption attempt | LQ | Low
Demographics | Age | LQ | Low
 | Current position | LQ | Low
 | Study area | LQ | Low
Table 14. KSF “Interactions between Students”.
Construct | Variable | MM | UD
Interactions in general | Number of group works/projects carried out in the course | SQ | Medium
 | % of the course grade derived from group work | SQ | Medium
Online collaboration | Number of remote meetings with other students throughout the course | SQ | Low
 | Number of online presentations made by the student with the assistance of other students | SQ | Low
Face-to-face collaboration | Number of face-to-face meetings with other students throughout the course | SQ | Low
 | Number of face-to-face presentations made by the student with the assistance of other students | SQ | Low
Table 15. KSF “Interactions with Lecturers”.
Construct | Variable | MM | UD
Interactions students/professors | Number of orientation meetings throughout the course | SQ and LQ | Low
 | Number of meetings to monitor projects throughout the course | SQ and LQ | Low
Table 16. KSF Weights.
Dimension | KSF | (i) Uniform Distribution | Number of References | (ii) Relative Relevance
Content quality (references = 26) | Artifacts | 0.33 | 11 | 0.42
 | Student Assessment | 0.33 | 8 | 0.31
 | Learning Facilitation | 0.33 | 7 | 0.27
Organizational environment (references = 8) | Culture | 0.25 | 1 | 0.13
 | Policy | 0.25 | 2 | 0.25
 | Student Feedback | 0.25 | 3 | 0.38
 | Instructional Design | 0.25 | 2 | 0.25
Organizational infrastructure (references = 11) | Classrooms | 0.50 | 6 | 0.55
 | Technology | 0.50 | 5 | 0.45
Lecturer (references = 5) | Knowledge | 0.33 | 2 | 0.40
 | Skills | 0.33 | 1 | 0.20
 | Attitude | 0.33 | 2 | 0.40
Interactions (references = 4) | Between students | 0.50 | 2 | 0.50
 | With lecturers | 0.50 | 2 | 0.50
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
