Article

Mixed Analysis of the Flipped Classroom in the Concrete and Steel Structures Subject in the Context of COVID-19 Crisis Outbreak. A Pilot Study

La Salle Campus Barcelona, Ramon Llull University, GRETEL, 08022 Barcelona, Spain
* Author to whom correspondence should be addressed.
Sustainability 2021, 13(11), 5826; https://doi.org/10.3390/su13115826
Submission received: 1 April 2021 / Revised: 14 May 2021 / Accepted: 18 May 2021 / Published: 21 May 2021
(This article belongs to the Special Issue Information Systems, E-learning and Knowledge Management)

Abstract

A sudden lockdown was declared on 14 March 2020 due to the COVID-19 crisis, leading to an immediate change from face-to-face to online learning in all universities within Spanish jurisdiction. At La Salle School of Architecture, the Concrete and Steel Structures subject started online classes immediately after the lockdown law was published, using a methodology based on the flipped classroom approach and adapting the monitoring of students to the virtual environment. This article presents a pilot study analyzing the adaptation of the model to the online format using a mixed approach in which qualitative and quantitative surveys were conducted at the end of the course with 48 participants. Responses from both surveys were organized according to six categories (teachers, assessment, methods, class development, students and documents) and 14 subcategories, as developed in an ongoing research project involving the subject since the academic year 2017/2018. Thus, the open responses of the students have been analyzed alongside the quantitative data. The results demonstrate a proper adaptation of the model, as well as the students’ negative perception of the online format due to the loss of the face-to-face benefits of the flipped classroom.

1. Introduction

This article presents a pilot study on the sudden adaptation of the flipped classroom (FC) model from a face-to-face to an online format in the Concrete and Steel Structures subject of the undergraduate studies at La Salle School of Architecture in Barcelona [1], at Ramon Llull University. The abrupt change was caused by the lockdown due to the COVID-19 crisis, declared on Saturday 14 March 2020 for all universities within Spanish jurisdiction [2]. The next Monday, 16 March, courses that had been completely face-to-face just one week before restarted online.
The study presented in this article builds on broader research started in 2016 that focuses on the continuous improvement of learning through learning analytics and the formative assessment of learning [3,4]. Data are obtained from quantitative and qualitative surveys conducted regularly, while information about the user experience is gathered continuously from the learning management systems used. In this context of learning analytics, the collected data are analyzed and used to improve the course design by acting on the weaknesses detected.
The aim of the present pilot study is to assess the adaptation of the flipped learning (FL) model used when the COVID-19 crisis suddenly emerged. With this purpose, the following two research questions have been formulated, to be answered using the data obtained through the ongoing research in the subject:
  • Has the FL model used in the Concrete and Steel Structures subject adapted properly from a face-to-face format to the online format?
  • Do the students of the Concrete and Steel Structures subject negatively perceive the change to the online format?
It is known that the FC works to enhance the quality of class time, reallocating lectures out of class time [5], usually with the help of learning material and videos uploaded in learning management systems [6]. Therefore, in the context of FL research, it is relevant to analyze this adaptation and the perception of the students when being suddenly forced to leave the face-to-face class and take the rest of the course in the online format.
Furthermore, this study aims to address the lack of research on learning assessment in architecture undergraduate studies, especially regarding the application of active learning in science, technology, engineering and mathematics (STEM) subjects such as structures.
The ongoing research in the Concrete and Steel Structures subject, and specifically in the FL analysis, is linked to previous research in the field of FL, as it follows the future lines recommended by Karabulut-Ilgu, Jaramillo and Jahren in “A systematic review of research on the flipped learning method in engineering education” [7]: (a) reforming engineering education through theoretically sound frameworks, (b) using qualitative and longitudinal data to provide deeper understanding, (c) investigating systematic adoption of FL in engineering education, and (d) shifting the focus from academic skills to professional skills.

2. Theoretical Framework

2.1. Architecture: Art and Science

Architecture is a synthesis of art and science [8,9,10,11], but the balance between these two terms has not historically been the same [12]. De Architectura libri decem (Ten Books of Architecture [13]), written by Marcus Vitruvius in the 1st century BC, is not only the oldest known architecture treatise but also the reference book for establishing the meaning of architecture [14], and it supports the view that architecture is both art and science. Vitruvian theory also asserts that there are three architectural virtues, three interrelated qualities located at the vertices of an equilateral triangle: firmitas, utilitas and venustas (i.e., structural stability, appropriate spatial accommodation, and attractive appearance). Over the years, the Vitruvian Triad has been widely discussed, with these three interrelated terms considered together.
However, in the 18th century, the equilibrium of this triad was altered by giving primacy to venustas. Thus, beauty became the essential prerequisite of Architecture (with a capital A), and construction (i.e., firmitas and utilitas) was subordinated in the theory and development of buildings [15,16]. On the other hand, after 1800, when engineers began creating structures, the concept of venustas had no value against the laws of gravity. What is more, firmitas, the structural dimension of the triad, became more and more complex with the emergence of new materials and construction systems, new technologies, new structural calculation methods and new building typologies [17,18].
Today, even though architecture theoretically embraces the technical dimensions worldwide, deep knowledge of structures is not considered crucial in most countries. Architects have different professional attributions in each country, and that has led to different teaching approaches and school curricula, especially regarding the competences required in technical subjects [19]. Most European countries do not require deep structural knowledge from architecture students, since architects there do not have such professional attributions, while in other countries, such as Spain, architects are considered ultimately responsible for building structures [20]. Even though these differences exist, European Directive 2005/36/CE [21] establishes automatic recognition of the Architect title in all EU countries, with the only condition being possession of an Architecture degree obtained in any EU member state, with no mention of the differences in professional attributions between states. Therefore, the architectural curriculum in the Spanish system [22] must keep a proper balance between art and science, design and technology, and building [23], and architecture study programs include complete training in structures [24]: design and calculation, in coherence with the architectural concept, and execution surveying. With some differences between architecture schools, all curricula include subjects on mathematics and physics, on resistance of materials and on the application of current structural codes for concrete and steel structures.

2.2. Active Learning and Flipped Learning

Active learning is anything that “involves students in doing things and thinking about the things they are doing” [25] and can be defined as classroom-based activities designed to engage students in their learning through answering questions, solving problems, discussing content or teaching others, individually or in groups [26]. Although the effectiveness of active learning has been established by several studies [27], specifically in the science, technology, engineering and mathematics (STEM) subjects [28,29], traditional teaching methods are still dominant in undergraduate courses [30,31].
Nevertheless, active learning has long been used in Architecture undergraduate studies, mainly in the project subjects, which have traditionally applied project-based learning (PBL) [32], at least since the 19th century (e.g., at the Ecole des Beaux-Arts in Paris [33]), as well as in design studios [34], where PBL and problem-based learning [35] are used to integrate knowledge from several disciplines.
In this context, FL is an active learning strategy that appears to be particularly well suited to engineering education [7] because of its potential to “combine learning theories once thought to be incompatible—active, problem-based learning activities founded upon a constructivist ideology and instructional lectures derived from direct instruction methods founded upon behaviourist principles” [36]. FL uses face-to-face class time to solve complex exercises in which students can interact with the teacher and other students to consolidate the theoretical knowledge previously acquired outside the classroom by watching videos or reading. It can thus be said that FL is a blended learning model that combines face-to-face and e-learning.
FL implementation is flourishing in the academic world at all educational levels and with different learning designs [26,37]. Research on FL shows multiple benefits, such as flexibility for the students [38], an increase in motivation [39,40], increased participation [41], and improvements in interaction between students [42,43] and between students and teachers [44,45], in student engagement [46], in professional skills and in academic results [47], among others.
On the other hand, research also shows FC challenges and limitations, such as an increased workload for the faculty, student resistance [48], decreased interest, neglected material, and problems for students with a low capacity for abstraction, poor problem-solving skills or difficulties in using learning platforms [49].

2.3. Learning Assessment

The assessment of learning is generally carried out through two approaches, formative and summative [50]. The main objective of summative assessment is to evaluate the student’s learning at the end of the didactic unit, project, or assessment activity corresponding to a specific syllabus. Summative assessment provides teachers with information on the level of achievement of learning content or competencies. In contrast, the main goal of formative assessment is to feed the instructional model to meet the needs of students. Formative assessment is used to check students’ level of understanding and to plan the most appropriate instructional design, guiding teachers throughout the learning process.
Formative assessment helps to (a) identify strengths and weaknesses, (b) report objectively with data, (c) carry out improvement actions and (d) detect problematic situations and anticipate actions to resolve them in time, and it provides a data-based approach that can help to analyze the information. The Big Data movement integrated into education has transformed education from a dual technological and methodological perspective [51]. On the one hand, the adoption of educational technology has a strong analytical and interaction-gathering aspect. On the other hand, this integration is accompanied by a change in the way we act and make decisions based on data. The adoption of analytical methodologies goes beyond Educational Data Mining [52]. At the academic level, the Academic Analytics approach [53] is applied as a process to improve management. At the level of the teaching-learning process, the Learning Analytics (LA) approach [54,55] is applied with the ultimate goal of improving the educational context through the analysis of data in a five-phase cycle [56] (definition of improvement objectives, data collection, analysis and visualization of results, improvement actions, and iterative evaluation) that allows relevant information to be obtained from students’ interactions with digital educational technology.
FC, formative assessment and LA are closely related [57,58], as FC requires a continuous flow of information for content generation, class preparation and student support, and formative assessment allows this information to be extracted continuously. This approach offers a firm data framework for decision-making in the application of the FC methodology [40] and also for the evaluation of its implementation.
Evaluation methods in learning research are typically quantitative, qualitative or mixed [59]. The method to be used depends on the type of data to be collected and the characteristics of the sample [60]. Both approaches are scientifically valid, the results obtained are comparable, and both are similar in terms of citation impact [61].
Quantitative studies are based on numeric data analysis intended to validate the phenomenon being studied. These methods usually work with a small number of variables, and statistical analysis is usually required to check the reliability of the results. Data are commonly obtained from surveys with questions that students answer on a Likert scale [62]. Quantitative methods are limited because the numerical descriptions provided give no details about the students’ perceptions or motivations.
On the other hand, the main purpose of qualitative methods is to get a better understanding of the human perspective of the sample, and detailed descriptions of specific items of the experiment are obtained.
While quantitative studies are usually focused on finding causes [63], qualitative studies are mainly explanatory [64,65]. Relating cause and effect using a control group is usual in quantitative research [66,67], while qualitative research may be phenomenological [68], ethnographic [69], theory-generation focused [70], etc., and does not necessarily require a control group. There are prospective [71], observational [72] or pilot [73] cohort studies in which there is no experimental manipulation; instead, participants are followed in the learning context by subjecting them all to the same “experimental protocol”. Conclusions are drawn, and the research is continued as experimental, quasi-experimental or non-experimental. Cause can be extracted when the data are quantitative [74], whereas explanation [65] and correlation [75] are obtained when the data are qualitative.
Establishing cause requires defining the factors that actually produce the effect, using specific quantitative techniques (such as chi-square tests [76] for an independent variable); correlation defines only part of the cause, meaning not that the independent variable studied is the cause itself but that it is one of the causes [77]; and explanation implies taking qualitative data and exposing them [78].
In our research, we use a qualitative approach based on parametrization, supported by quantitative data, to correlate flipped classroom effects in the context of the change from face-to-face to online education, without defining a final cause. Mixed assessment uses both quantitative and qualitative methods to minimize the weaknesses of each and to enhance their strengths, obtaining very robust results when they are properly combined.

3. Methodology

3.1. Goals, Research Design and Methods of the Study

As stated, the purpose of the study is to assess the adaptation of the FC model implemented in the course Concrete and Steel Structures, from face-to-face to online.
With that purpose, two research questions (RQ) have been raised. Data obtained from qualitative and quantitative surveys conducted at the end of the course are analyzed according to these RQ. These surveys have been conducted regularly in the subject since 2017, as part of broader research [3,4] that pursues the continuous improvement of the students’ learning on structures. In this case, students’ answers were delivered online due to the lockdown.

3.2. Participants

Students were divided into two groups according to the language in which classes were conducted, even though the course development was identical for both. The international group took the subject in English with one professor, and the local group took the subject in Spanish and Catalan with another professor. The international group consisted of 18 students, while the local group had 50 students. Sixteen students of the international group answered the surveys (88.88%), while 32 students of the local group answered them (64%). The total number of students who answered the surveys was 48, representing 70.59% of all students who took the subject. For the purpose of this study, there is no comparison between groups, and the results show the data obtained from the 48 participants.

3.3. Course Design

The Concrete and Steel Structures course is a 9 ECTS (European Credit Transfer and Accumulation System [79,80]) credit subject of the 3rd year of the Architecture undergraduate studies at La Salle, Ramon Llull University, in Barcelona [1]. There is an introductory subject on resistance of materials with 9 ECTS credits, called Introduction to Structures, given in the 2nd year. Another subject, Geotechnics and Foundations, with 3 ECTS credits, completes the program in the 4th year.
The Concrete and Steel Structures course affected by the COVID-19 crisis started on 16 September 2019, with a face-to-face model of 50 sessions of 2 h, from 8 am to 10 am, on Mondays and Tuesdays, finishing on 18 May 2020. The course was divided into 4 parts: (a) Actions, 10 sessions; (b) Concrete structures part 1, 12 sessions; (c) Concrete structures part 2, 12 sessions; and (d) Steel structures, 16 sessions.
A flipped classroom model was implemented for the whole course, following this cycle of 2 classes that was completed each week:
  • A brief introduction to the topic is explained in class;
  • Learning materials (pdf documents) are uploaded to the Moodle platform of the university for parts 1, 2 and 3; in the case of part 4, a video on the topic is uploaded;
  • Students read and study the notes at home and watch the video when available;
  • The next class starts by solving the students’ doubts, which could take up to 50% of class time, depending on the students’ demand. After the doubt-solving time, exercises on the topic are solved in class;
  • The following class also starts with doubt solving, limited to 15 min. Then, a short exam of at most 30 min is held. A brief introduction to the next topic is given in the last hour of class, and the cycle is repeated from point 2.
When lockdown was suddenly imposed on 14 March 2020, the subject underwent several changes to adapt to the online platform. The last face-to-face class was the 2nd session of the 4th part, held on Tuesday 10 March 2020. The 1st online session was held on Monday 16 March 2020. The 1st short exam on part 4 was held in the class of 10 March, and a brief introduction to the 2nd topic on steel was given. At that time, La Salle University introduced the Blackboard Collaborate plug-in in the Moodle platform for online classes.
The main changes introduced in the subject resulted from the decision not to hold tests and exams online until the university had implemented an online examination system, which was ready after Easter. Therefore, only two test sessions were held during lockdown, combining the seven short tests that remained. The rest of the course continued following the same cycle, with some slight variations:
  • Videos were uploaded to the Moodle platform of the university, as well as the course notes and exercises on the topic;
  • Students watched the video and read and studied the notes at home;
  • The next class started by solving the students’ doubts. After the doubt-solving time, exercises on the topic were solved in class;
  • The following class also started with doubt solving, limited to 15 min. More exercises on the topic were solved in class. A brief introduction to the next topic was given in the last half hour of class, and the cycle was repeated from point 1.
Two classes were completely dedicated to the combined short exams, following the general invigilation system used for the ordinary and extraordinary exams held in all subjects of the university. The first session combined 5 of the planned short exams into one, and the last one included two short exams. Thus, the 8 short exams planned for part 4 were completed.

3.4. Instruments

A mixed approach is used to perform the study, based on both quantitative and qualitative surveys. The quantitative surveys conducted at the end of the course contained several questions related to the motivation and satisfaction of the students, to be rated on a Likert scale from 1 to 5. The validity of the questionnaire was checked by five PhD-holding experts in user-experience assessment belonging to the UserLab [81] at La Salle University, the first academic UserLab in the south of Europe.
A qualitative approach was also used to assess the group’s satisfaction with the flipped classroom methodology, using the Bipolar Laddering Assessment (BLA) [82]. The BLA methodology is based on a Socratic questionnaire with open questions to be answered in three steps and has demonstrated its validity in several studies with both qualitative and mixed approaches [83,84,85,86].
  • Firstly, students are requested to identify strengths and weaknesses, five of each type at most;
  • Secondly, they are asked to value these strengths and weaknesses between 1 and 10;
  • Finally, they are invited to propose how to improve what they have identified as strengths and weaknesses.
Students freely decide and identify which points they consider positive and negative. No suggestions or direct questions are stated by the survey conductor, to avoid predetermining the answers. This kind of survey provides quality information about which aspects are relevant for the student, with no interference.
Socratic questionnaires [87,88] are used in qualitative studies. In this method, the user is invited to reflect on the experience through a dialogue with a guide who helps in the identification of strengths and weaknesses and resolves possible doubts. The role of the guide is very different from that of a classic interviewer, because it is not strategic and there is no direct intervention to influence the responses.
Figure 1 shows the format of the BLA interview. It is a real sample with responses and marks from one student. As the questionnaire was sent by email due to the lockdown, instructions to complete the interview were given in the message.
The BLA interview was performed in May 2020 by videoconference due to the lockdown. The form was sent by email and uploaded to the Learning Management System minutes before being completed, and the following instructions were given to the students by the teacher using the videoconference platform:
The BLA form should be filled in four steps:
  • Write in the first 5 boxes the 5 aspects you most liked of this structures course.
  • Write in the next 5 boxes the 5 aspects you most disliked of this structures course.
  • Put a grade from 1 to 10 for each one of the aspects listed, in the box at the right of each one. For both strengths and weaknesses 1 means you completely dislike the aspect and 10 means you absolutely liked it.
  • In the next 10 boxes, write what you think could be changed to improve each aspect listed. There are 10 boxes, one for each aspect previously stated among both positive and negative aspects.
These instructions were given step by step, so that the instructions for step 2 are given after students have completed step 1, and so forth until all steps are completed.
BLA interviews were completed by the students and returned to the teacher by email with no interference from the teacher, whose only role was to guide the students in correctly filling in the form, giving them full freedom to answer whatever they perceived as most relevant.

3.5. Mixed Data Analysis

By its nature, the data obtained are mainly analyzed with a qualitative approach, and the quantitative data are mainly used to support the validity of the conclusions obtained. The frequency of an answer in open questionnaires has been proven valid for answering research questions even with small samples [89,90]. The scientific validity of qualitative research is widely proven [91,92].
To analyze the answers of the qualitative assessment, 6 categories and 14 subcategories have been defined. The answers of the students are classified into these subcategories, allowing comparison between categories and subcategories and making it possible to analyze the evolution of each category after each change. This categorization is also appropriate for classifying the changes in the subject, so that the changes can be related to the answers in the surveys. Each category has a one-letter code, while each subcategory has a two-letter code: the first letter identifies the category and the second is assigned in alphabetic order.
The 6 categories are: Teachers (T), Documentation (D), Course approach (C), Class activities, dynamics and planning (P), Assessment (A) and Students (S).
The 14 subcategories are: Teachers (TA), Documents (DA), Videos (DB), Content of the course (CA), Course approach (CB), Time/content rate (CC), Class development (PA), System and methodology (PB), Schedule (PC), Technology use (PD), Assessment method (AA), Assessment development/organization (AB), Students’ motivation (SA) and Student’s profit (SB).
This classification is the one used in the aforementioned research [3,4] and has been defined according to the different changes and aspects to be measured. For example, the implementation of the flipped classroom is categorized as PB, an improvement of the course notes is classified as DA and the introduction of continuous assessment tests belongs to the AA category.
When categorizing the answers of the students, each answer code has three letters, adding a preceding character to the subcategory code: for strengths, the code starts with an S; for weaknesses, with a W; and for proposals, with a P. Therefore, if a student writes in the weaknesses section that the exams allow too little time, the answer is classified as WAB, and if a student finds the videos very helpful, that answer is categorized as SDB.
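As an illustration of this coding scheme, the short Python sketch below shows how such three-letter codes could be assigned programmatically. It is not part of the original study; the dictionary follows the subcategory list above, while the function and variable names are hypothetical.

```python
# Hypothetical sketch: assigning BLA response codes from the category scheme above.
SUBCATEGORIES = {
    "TA": "Teachers", "DA": "Documents", "DB": "Videos",
    "CA": "Content of the course", "CB": "Course approach", "CC": "Time/content rate",
    "PA": "Class development", "PB": "System and methodology", "PC": "Schedule",
    "PD": "Technology use", "AA": "Assessment method",
    "AB": "Assessment development/organization",
    "SA": "Students' motivation", "SB": "Student's profit",
}

SECTION_PREFIX = {"strength": "S", "weakness": "W", "proposal": "P"}

def code_response(section: str, subcategory: str) -> str:
    """Build the three-letter code, e.g. ('weakness', 'AB') -> 'WAB'."""
    if subcategory not in SUBCATEGORIES:
        raise ValueError(f"Unknown subcategory: {subcategory}")
    return SECTION_PREFIX[section] + subcategory

# Examples from the text: an exam-time complaint and a positive comment on videos.
print(code_response("weakness", "AB"))   # WAB
print(code_response("strength", "DB"))   # SDB
```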
The categorization system used in this study for the qualitative assessment can easily be adapted to any other subject once the appropriate categories for each case are identified. Depending on the results, these categories can undergo slight variations or can be divided into more subcategories to obtain more relevant and detailed information.
Students can also assess each strength and each weakness that they have written in the BLA survey with a grade from 1 to 10. As stated by Pifarré and Tomico [68], extreme marks have a strong emotional component. Positive extreme marks (9 or 10) mean that the student is thankful for that element, and negative extreme marks (1 or 2) mean that the student especially dislikes that element.
Data obtained from quantitative surveys is treated statistically to allow further analysis and better understanding. For each question in the survey, the number of responses for each value on the Likert scale from 1 to 5 (n) is stated, as well as the mean (M), standard deviation (SD), skewness (Skw) and kurtosis (Kme). When required, the 95% confidence interval is also calculated (μ), as well as the upper (X1) and lower (X2) endpoints.
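As an illustration of these parameters, the following Python sketch (not the authors' script) computes them for one invented Likert-scale question with 47 responses. The confidence-interval formulation is an assumption: it uses a normal approximation based on the standard error of the mean, since the exact formula behind μ, X1 and X2 is not stated in the paper.

```python
# Hypothetical sketch: descriptive statistics for one Likert-scale question,
# mirroring the parameters reported (n, M, SD, Skw, Kme, mu, X1, X2).
import numpy as np
from scipy import stats

# Illustrative responses on a 1-5 Likert scale (invented data, n = 47).
responses = np.array([5, 4, 4, 3, 5, 4, 2, 4, 5, 3, 4, 4, 5, 3, 4, 4, 5, 4, 3, 4,
                      5, 4, 4, 2, 4, 5, 3, 4, 4, 5, 4, 3, 4, 4, 5, 4, 3, 4, 4, 5,
                      4, 3, 4, 4, 5, 4, 3])

n = len(responses)
M = responses.mean()                # mean (M)
SD = responses.std(ddof=1)          # sample standard deviation (SD)
Skw = stats.skew(responses)         # skewness (Skw)
Kme = stats.kurtosis(responses)     # excess kurtosis (Kme); definition assumed

# Assumed 95% confidence interval margin (mu) and endpoints, using a normal
# approximation with the standard error of the mean.
mu = 1.96 * SD / np.sqrt(n)
X1, X2 = M + mu, M - mu             # upper and lower endpoints

print(f"n={n}, M={M:.2f}, SD={SD:.2f}, Skw={Skw:.2f}, Kme={Kme:.2f}")
print(f"mu={mu:.2f}, X1={X1:.2f}, X2={X2:.2f}")
```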
One implication of this research, to be developed in future work, is the numerical management of qualitative data. Finding and checking proper ways to assign numeric values to qualitative data obtained in open questionnaires can lead to statistical analyses that can be scientifically treated to yield valid conclusions, even more so when these data are transformed in a way that allows them to be combined with data obtained from quantitative methods.

4. Results

In this section, the results obtained from the qualitative and quantitative surveys conducted at the end of the course are shown, with special mention of the parameters related to the stated RQs.
Regarding the RQ1 (Has the FL model used in the Concrete and Steel Structures subject adapted properly from a face-to-face format to the online format?) the following parameters will provide relevant information:
  • Index of mention of the proper adaptation of the course to the online format in the strengths section;
  • Proportion of responses related to the FL in the strengths section;
  • Proportion of responses related to the FL in the weaknesses section;
  • Positive and negative extreme marks of the responses related to FL;
  • Lower endpoint value of the 95% confidence interval of the mark in the BLA for the categories related to FL;
  • Lower endpoint value of the 95% confidence interval of the responses in the quantitative survey for the categories related to FL.
Additionally, regarding the RQ2 (Do the students of the Concrete and Steel Structures subject negatively perceive the change to the online format?) parameters that provide relevant information will be:
  • Index of mention of the online classes in the weaknesses section;
  • Motivation in class activities and home activities expressed in the quantitative surveys;
  • Motivation in online classes expressed in the quantitative surveys;
  • Relationship between positive extreme marks and negative extreme marks of the responses related to classes in online format.

4.1. Qualitative Assessment

Table 1 and Table 2 show the common elements (those that are mentioned by two or more participants) of the strengths and weaknesses sections of the BLA survey.
Table 3 and Table 4 show the elements with extreme positive and negative marks written in the BLA survey by the students. Elements related to FL and to the online format of classes are indicated as they are used to set the value of the parameters defined at the beginning of Section 4 related to extreme marks. The element “Online classes development and adaptation” is related to both FL and online format, and the 2 extreme marks have been counted for both parameters.
In Table 5 and Table 6, all responses in the strengths and weaknesses sections are classified by subcategory. These tables are sorted according to the alphabetic order of the subcategories and given a color code for better comparison between the data in the tables.
The index of mention is the ratio between the number of students who mention a subcategory and the total number of students (48), while the category weight is the ratio between the number of mentions of a category and the total number of mentions.
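Expressed as formulas (with symbols introduced here purely for illustration, as they do not appear in the original tables):

$$\mathrm{IM}(s) = \frac{n_s}{N}, \qquad W(c) = \frac{m_c}{M_{\mathrm{tot}}},$$

where $n_s$ is the number of students mentioning subcategory $s$, $N = 48$ is the number of participants, $m_c$ is the number of mentions of category $c$ and $M_{\mathrm{tot}}$ is the total number of mentions. For instance, the index of mention of 41.66% reported in Section 5 for the adaptation to the online format corresponds to roughly 20 of the 48 participants mentioning it as a strength.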
For a better interpretation of results, the BLA survey asks the students to grade each of the strengths and weaknesses from 1 to 10, even though some students did not write any grade. These grades can be found in Table 7, grouped by subcategory. Blank values mean that no student graded any response in that subcategory.
Table 8 shows the average grade for each subcategory, considering in the same row all graded answers for that subcategory and making no distinction between strengths and weaknesses. The standard deviation is calculated to obtain the 95% confidence interval μ and, from it, the lower and upper endpoint values for this degree of confidence.

4.2. Quantitative Assessment

Quantitative surveys to measure students’ motivation and satisfaction were also conducted after the course. For the purpose of this article, Table 9, Table 10 and Table 11 show the most significant results. The sample of the quantitative surveys is 47 students. In these tables, the number of answers for each value on the Likert scale is shown (n is the number of answers and the value in brackets is the percentage of responses obtained for each value), as well as the mean (M), the standard deviation (SD), the skewness (Skw) and the kurtosis (Kme). Each question is related to one of the subcategories, in a way that allows results to be combined into global values grouped by subcategory, which are shown in Table 12.
Answers about satisfaction are gathered in Table 9, while those regarding motivation are shown in Table 10. Other closed questions included in the quantitative surveys related to some of the subcategories are presented in Table 11. All questions are answered on a Likert scale from 1 to 5, making it possible to manage and group the data obtained from the different tables.
Table 12 combines the results shown in Table 9, Table 10 and Table 11 according to the related subcategory. Values are obtained as averages. The statistical parameters shown are calculated for n = 47 (the number of students who answered the survey), using the average grade of each student’s answers related to each subcategory. The standard deviation is calculated to obtain the 95% confidence interval μ and, from it, the lower and upper endpoint values for this degree of confidence.
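A minimal sketch of this aggregation, assuming invented column names and data (the authors' dataset is not public), could look as follows in Python; the confidence-interval formula is the same assumed normal approximation noted earlier.

```python
# Hypothetical sketch: per-student averaging and per-subcategory statistics,
# mirroring the aggregation described for Table 12 (invented data and names).
import numpy as np
import pandas as pd

# Long-format answers: one row per student, question and 1-5 value,
# each question already mapped to a subcategory code (e.g. "DB" for videos).
answers = pd.DataFrame({
    "student": [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "subcategory": ["DB", "PA", "PB", "DB", "PA", "PB", "DB", "PA", "PB"],
    "value": [5, 4, 4, 4, 3, 4, 5, 4, 3],
})

# Step 1: average each student's answers within each subcategory.
per_student = answers.groupby(["subcategory", "student"])["value"].mean()

# Step 2: per-subcategory statistics over those per-student averages.
def summarize(grades: pd.Series) -> pd.Series:
    n = len(grades)
    M, SD = grades.mean(), grades.std(ddof=1)
    mu = 1.96 * SD / np.sqrt(n)   # assumed 95% CI margin (normal approximation)
    return pd.Series({"n": n, "M": M, "SD": SD, "mu": mu, "X1": M + mu, "X2": M - mu})

table12_like = per_student.groupby(level="subcategory").apply(summarize)
print(table12_like)
```

Each resulting row corresponds to one subcategory, mirroring the structure described for Table 12.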

5. Discussion

Flipped learning (FL) is a blended learning methodology that combines face-to-face time in class with out-of-class activities that, in most cases, rely on technology, such as Learning Management Systems (LMS) and the support of ad hoc videos, previous recordings or other documents available online. FL use is blooming, and extensive recent research on it underlines benefits such as flexibility for the students, increased motivation, increased participation, improvements in interaction between students and between students and teachers, in student engagement, in professional skills and in academic results, among others.
Some of these benefits seem to be strongly related to face-to-face learning (e.g., interaction between students [93,94] and interaction between student and teacher [95,96]), while others (motivation, participation, engagement) could be linked to class activities with the support of the teacher (complex exercise solving, doubt solving, etc.). Improving the quality of face-to-face sessions is a fundamental part of the FL system and the main reason to move the lecture out of class time [5]. The COVID-19 crisis outbreak implied the removal of the face-to-face sessions of the course and led to a sudden adaptation of the model to a completely online format.
The aim of this study is to analyze this conversion between class formats, to contribute to checking the validity of FL in blended education and to confirm the benefits of FL related to face-to-face classes. In this context, two research questions have been stated:
RQ1
Has the FL model used in the Concrete and Steel Structures subject adapted properly from a face-to-face format to the online format?
RQ2
Do the students of the Concrete and Steel Structures subject negatively perceive the change to the online format?
To answer RQ1, and according to the results shown in the previous section, obtained from the data analysis, the values of the six relevant parameters listed in Section 4 are:
  • Index of mention (IM) of the proper adaptation of the course to the online format in the strengths section: 41.66%. It is the third most cited element, after videos (IM = 81.25%) and continuous assessment (IM = 70.83%);
  • Proportion of responses related to the FL in the strengths section: 65.01% (121 responses out of 186 total responses);
  • Proportion of responses related to the FL in the weaknesses section: 27.27% (27 out of 99 responses);
  • Positive and negative extreme marks of the responses related to FL: 25 extreme positive marks (58.14% of the 43 total positive extreme marks) and 2 extreme negative marks (22.22% of the 11 total negative extreme marks);
  • Lower endpoint value of the 95% confidence interval of the mark in the BLA for the categories related to FL: on a scale from 1 to 10, this value is 8.70 for category DB (videos), 6.01 for category PA (class development) and 5.08 for category PB (system and methodology);
  • Lower endpoint value of the 95% confidence interval of the responses in the quantitative survey for the categories related to FL: on a scale from 1 to 5, this value is 3.98 for category DB (videos), 3.70 for category PA (class development) and 3.53 for category PB (system and methodology).
In addition, an external relevant parameter regarding RQ1 can be added to this analysis. In the ordinary survey conducted at the end of the academic year in La Salle for 46 subjects of the Architecture undergraduate studies, the Concrete and Steel Structures subject is highly rated by the students when asked about the adaptation of the subject to the online learning format (4.55 over 5), reaching the fifth position, and the second place when considering only the subjects that have more than five responses.
Similarly, the value of each one of the four relevant parameters listed in Section 4 to answer RQ2 is shown below:
  • Index of mention of the online classes in the weaknesses section: 16.67%. It is the third most cited element, after the ratio content/time (IM = 29.16%) and time given for exams (IM = 25.00%);
  • Motivation in class activities and home activities expressed in the quantitative surveys: on a scale from 1 to 5, the average rating of motivation for class activities is 4.00 (listening to teachers’ explanations), 3.93 (doing exercises in class) and 3.92 (solving doubts in class). The rating for home activities is 3.478 (studying notes at home) and 3.614 (watching videos at home);
  • Motivation in face-to-face and online classes expressed in the quantitative surveys: on a scale from 1 to 5, the average rating of motivation for attending face-to-face classes is 3.85, while the value for attending online classes is 2.935;
  • Positive and negative extreme marks of the responses related to classes in online format: two extreme positive marks (4.65% of the 43 total positive extreme marks) and one extreme negative mark (11.11% of the 11 total negative extreme marks).
Although it is not related to the research questions stated, we think it is important to underline that even though students do not like the exams (a 3.55 rating in the quantitative satisfaction survey and 4.31 in the qualitative survey), they answer that the most motivating activity of the course for them is being assessed continuously (4.04 over 5 in the quantitative surveys and 7.96 over 10 in the subcategory average of the qualitative surveys). Continuous assessment has been proven to be crucial for the involvement of the students in the FL methodology during the whole course [97] and can be determinant in the success or failure of FL implementation.

6. Conclusions

According to the results obtained, and regarding RQ1, it can be deduced from the six relevant parameters analyzed that the FL model used in the Concrete and Steel Structures subject has adapted properly from a face-to-face format to the online format.
Therefore, this study reveals that, even though the students rate their motivation in the flipped classroom relatively low, the FL model designed for the Concrete and Steel Structures subject has been proven to work properly for both face-to-face and online classes and can be used in hybrid situations where the course needs to be delivered by mixing face-to-face and online classes, as is currently happening due to the pandemic crisis.
Concerning RQ2, and according to the values of the four parameters used, it can be inferred that the students of the Concrete and Steel Structures subject perceive the change to the online format as negative.
Therefore, the research supports previous findings in FL research [7] related to the benefits linked to face-to-face classes [94], such as increased motivation, participation and engagement [95]. One of the fundamentals of FL is to enhance the quality of class time, where students can interact with other students and with the teacher, thanks to the reallocation of the lecture outside the class [5]. Online classes restrain these benefits, and even though FL efficiency in online formats has been proven [98], the results reveal that students who have been enjoying the advantages of face-to-face classes in an FL model and are forced to move online show strong opposition to the online format.
This study has several known limitations. The experiment was only developed in a specific geographic and academic context, and only one course was considered. There is no control group to statistically check the validity of the findings presented. The size of the sample, although it can be justified and supported by previous studies, can lead to unbalanced results. Therefore, the evidence must be treated with caution, as it cannot be generalized to the population as a whole.
Similar studies in other regions and academic contexts could help to overcome these limitations and allow the results to be compared in order to find indicators for verifying the improvement of learning, since replication of the experimental conditions is not possible given their link to the outbreak of the COVID-19 crisis.

Author Contributions

C.C. structured the paper and wrote Section 3, Section 4, Section 5 and Section 6. C.C., D.F., D.A. and N.M. contributed equally to the introduction and Section 2. This paper is based on the PhD thesis in progress of C.C., directed by D.F. and N.M. D.F., D.F. and E.P. contributed to funding acquisition. All authors contributed equally to the supervision and reviewing of this paper. All authors have read and agreed to the published version of the manuscript.

Funding

The research that has led to these results has been carried out with funds from the “la Caixa” Foundation, Grants for helping the Research of Ramon Llull University, la Salle, under the grant: 2020-URL-IR2nQ-006.

Institutional Review Board Statement

The research presented, as well as the design, collection and management of its data, has received a POSITIVE evaluation and has been APPROVED by the Ethics Committee of the Ramon Llull University, with file number CER URL_2020_2021_009.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data used in this study are available on request. Data are not public for privacy reasons.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. La Salle Academic Guide. Syllabus of the degree in Architecture Studies. Available online: https://www.salleurl.edu/en/education/degree-architecture-studies/syllabus (accessed on 20 January 2021).
  2. Ministerio de la Presidencia del gobierno de España. Real Decreto 463/2020, de 14 de Marzo, por el que se Declara el Estado de Alarma para la Gestión de la Situación de Crisis Sanitaria Ocasionada por el COVID-19; BOE, Ministerio de la Presidencia, relaciones con las Cortes y Memoria Democrática; Gobierno de España: Madrid, Spain, 2020; pp. 1–3. [Google Scholar]
  3. Campanyà, C.; Fonseca, D.; Martí, N.; Amo, D.; Simón, D. Assessing the Pilot Implementation of Flipped Classroom Methodology in the Concrete and Steel Structures Subject of Architecture Undergraduate Studies. In Innovative Trends in Flipped Teaching and Adaptive Learning; IGI Global: Hershey, PA, USA, 2019. [Google Scholar]
  4. Campanyà, C.; Fonseca, D.; Martí, N.; Peña, E.; Ferrer, A.; Llorca, J. Identification of Significant Variables for the Parameterization of Structures Learning in Architecture Students; Springer: Cham, Switzerland, 2018; Volume 747, ISBN 9783319776996. [Google Scholar]
  5. Bergmann, J.; Sams, A. Flip Your Classroom Reach Every Student in Every Class Every Day, 1st ed.; International Society for Technology in Education: Eugene, OR, USA, 2014. [Google Scholar]
  6. Zhang, D.; Zhou, L.; Briggs, R.O.; Nunamaker, J.F. Instructional video in e-learning: Assessing the impact of interactive video on learning effectiveness. Inf. Manag. 2006, 43, 15–27. [Google Scholar] [CrossRef]
  7. Karabulut-Ilgu, A.; Jaramillo Cherrez, N.; Jahren, C.T. A systematic review of research on the flipped learning method in engineering education. Br. J. Educ. Technol. 2018, 49, 398–411. [Google Scholar] [CrossRef]
  8. Pollio, V. The Ten Books on Architecture; Archit. Read. Essent. Writings from Vitr. to Present; Dover Publications: Mineola, NY, USA, 1960. [Google Scholar] [CrossRef]
  9. Alberti, L.B.; Núñez, J.F. De Re Aedificatoria; Akal: Madrid, Spain, 1991; p. 476. ISBN 84-7600-924-0. [Google Scholar]
  10. Viollet-le-Duc, E.-E. Entretiens sur l’architecture. In Del espacio arquitectónico. Ensayo de epistemología de la arquitectura; 1980; ISBN 2884741526. Available online: http://www.amazon.fr/review/create-review/ref=dp_db_cm_cr_acr_wr_link?ie=UTF8&asin=2884741526 (accessed on 27 January 2021).
  11. Architecture. Encyclopedia Britannica; Encyclopaedia Britannica, Inc.: Chicago, IL, USA. Available online: https://www.britannica.com/topic/architecture (accessed on 1 March 2021).
  12. Moore, C.W. Architecture: Art and Science. J. Archit. Educ. 1965, 19. [Google Scholar] [CrossRef]
  13. Statham, H.H. Vitruvius: The Ten Books of Architecture. Translated by Prof. Morris, H. Morgan. with illustrations prepared under the direction of Prof. H. Langford Warren, 10 × 6¾, xiii + 331 pp. 61 illustrations. Cambridge: Harvard University Press. London: H. Milford. Oxford: University Press, 1914. 15s. n. J. Rom. Stud. 2012, 4. [Google Scholar] [CrossRef]
  14. Patterson, R. What vitruvius said. J. Archit. 1997, 2, 355–373. [Google Scholar] [CrossRef]
  15. Blondel, J.-F. Cours d’architecture, ou Traité de la Décoration, Distribution et Construction des Bâtiments: Contenant les Leçons Données en 1750 et les Années Suivantes; General Books LLC: Memphis, TE, USA, 2012; Volume 1, p. 118. ISBN 978-1235227318. [Google Scholar]
  16. Toca Fernández, A. El Tejido de la Arquitectura; Editorial Universidad de Sevilla: Seville, Spain, 2015. [Google Scholar]
  17. Durand, J.-N.-L. Précis des Leçons d’Architecture Données à l’École Polytechnique. Available online: https://www.amazon.es/DArchitecture-Polytechnique-Jean-Nicolas-Louis-Durand-J-N-L-Paperback/dp/B00YZLHCTW (accessed on 1 April 2021).
  18. The Making of Mechanical Engineers in France: The Ecoles d’Arts et Métiers, 1803–1914. Available online: https://search.proquest.com/openview/d834a795b9699b871dfec86a389c47ab/1.pdf?pq-origsite=gscholar&cbl=1819160 (accessed on 1 April 2021).
  19. Mejores Programas de Grado en Arquitectura en Europa 2021. Available online: https://www.licenciaturaspregrados.com/Grado/Arquitectura/Europa/ (accessed on 1 April 2021).
  20. Jefatura de Estado. Ley de Ordenación de la Edificación; Spain. 1999. Available online: https://www.boe.es/buscar/act.php?id=BOE-A-1999-21567 (accessed on 1 March 2021).
  21. European Parliament. Directive 2005/36/EC of the European Parliament and of the Council. Off. J. Eur. Union 2005, 255, 121. [Google Scholar]
  22. Ministerio de Educación. Orden EDU/2075/2010 Acuerdo de Consejo de Ministros por el que se Establecen las Condiciones a las que Deberán Adecuarse los Planes de Estudios Conducentes a la Obtención de Títulos que Habiliten para el Ejercicio de la Profesión Regulada de Arquitecto; BOE, 185; Ministerio de Educación: Madrid, Spain, 2010. [Google Scholar]
  23. Martí, N.; Fonseca, D.; Peña, E.; Adroer, M.; Simón, D. Design of interactive and collaborative learning units using TICs in architectural construction education. Rev. Construcción 2017. [Google Scholar] [CrossRef] [Green Version]
  24. ANECA. Libro Blanco. Título de Grado en Arquitectura; Agencia Nacional de Evaluación de la Calidad y Acreditación: Madrid, Spain, 2005. [Google Scholar]
  25. Bonwell, C.C.; Eison, J.A. Active Learning: Creating Excitement in the Classroom; ASHE-ERIC Higher Education Report No.1; School of Education and Human Development, The George Washington University: Washington, DC, USA, 1991; p. 121. ISSN 0884-0040. ISBN 1-878380-08-7. [Google Scholar]
  26. Nguyen, K.A.; Borrego, M.; Finelli, C.J.; DeMonbrun, M.; Crockett, C.; Tharayil, S.; Shekhar, P.; Waters, C.; Rosenberg, R. Instructor strategies to aid implementation of active learning: A systematic literature review. Int. J. STEM Educ. 2021, 8, 1–18. [Google Scholar] [CrossRef]
  27. Prince, M. Does Active Learning Work? A Review of the Research. J. Eng. Educ. 2004, 93, 223–231. [Google Scholar] [CrossRef]
  28. Freeman, S.; Eddy, S.L.; McDonough, M.; Smith, M.K.; Okoroafor, N.; Jordt, H.; Wenderoth, M.P. Active learning increases student performance in science, engineering, and mathematics. Proc. Natl. Acad. Sci. USA 2014, 111, 8410–8415. [Google Scholar] [CrossRef] [Green Version]
  29. Theobald, E.J.; Hill, M.J.; Tran, E.; Agrawal, S.; Nicole Arroyo, E.; Behling, S.; Chambwe, N.; Cintrón, D.L.; Cooper, J.D.; Dunster, G.; et al. Active learning narrows achievement gaps for underrepresented students in undergraduate science, technology, engineering, and math. Proc. Natl. Acad. Sci. USA 2020, 117, 6476–6483. [Google Scholar] [CrossRef] [Green Version]
  30. Stains, M.; Harshman, J.; Barker, M.K.; Chasteen, S.V.; Cole, R.; DeChenne-Peters, S.E.; Eagan, M.K.; Esson, J.M.; Knight, J.K.; Laski, F.A.; et al. Anatomy of STEM teaching in North American universities. Science 2018, 359, 1468–1470. [Google Scholar] [CrossRef] [PubMed]
  31. Hora, M.T.; Ferrare, J.J. Instructional Systems of Practice: A Multidimensional Analysis of Math and Science Undergraduate Course Planning and Classroom Teaching. J. Learn. Sci. 2013, 22, 212–257. [Google Scholar] [CrossRef]
  32. Kuhn, S. Learning from the Architecture Studio: Implications for Project-Based Pedagogy. Int. J. Eng. Educ. 2001, 17, 249–352. [Google Scholar]
  33. Condit, C.W.; Drexler, A. The Architecture of the Ecole des Beaux-Arts. Technol. Cult. 1978, 19. [Google Scholar] [CrossRef]
  34. Dutton, T.A. Design and studio pedagogy. J. Archit. Educ. 1987, 41. [Google Scholar] [CrossRef]
  35. De Graaff, E.; Cowdroy, R. Theory and practice of educational innovation through introduction of problem-based learning in architecture. Int. J. Eng. Educ. 1997, 13, 166–174. [Google Scholar]
  36. Bishop, J.L.; Verleger, M.A. The flipped classroom: A survey of the research. In Proceedings of the ASEE Annual Conference and Exposition, Conference Proceedings, Atlanta, GA, USA, 23–26 June 2013; ASEE: Atlanta, GA, USA, 2013. [Google Scholar]
  37. Bart, M. Survey Confirms Growth of the Flipped Classroom. 2014. Available online: https://www.facultyfocus.com/articles/blended-flipped-learning/survey-confirms-growth-of-the-flipped-classroom/ (accessed on 20 January 2021).
  38. Pozo Sánchez, S.; López-Belmonte, J.; Moreno-Guerrero, A.-J.; Sola Reche, J.M.; Fuentes Cabrera, A. Effect of Bring-Your-Own-Device Program on Flipped Learning in Higher Education Students. Sustainability 2020, 12, 3729. [Google Scholar] [CrossRef]
  39. Awidi, I.T.; Paynter, M. The impact of a flipped classroom approach on student learning experience. Comput. Educ. 2019, 128, 269–283. [Google Scholar] [CrossRef]
  40. Fernandez, A.R.; Merino, P.J.M.; Kloos, C.D. Scenarios for the application of learning analytics and the flipped classroom. In Proceedings of the IEEE Global Engineering Education Conference, Educon, Santa Cruz de Tenerife, Spain, 17–20 April 2018; IEEE: Piscataway, NJ, USA, 2018. [Google Scholar]
  41. López Núñez, J.A.; López-Belmonte, J.; Moreno-Guerrero, A.-J.; Marín-Marín, J.A. Dietary Intervention through Flipped Learning as a Techno Pedagogy for the Promotion of Healthy Eating in Secondary Education. Int. J. Environ. Res. Public Health 2020, 17, 3007. [Google Scholar] [CrossRef]
  42. García-Peñalvo, F.J.; Fidalgo-Blanco, Á.; Sein-Echaluce, M.L.; Conde, M.Á. Cooperative micro flip teaching. In Proceedings of the International Conference on Learning and Collaboration Technologies, Toronto, ON, Canada, July 17–22 2016. [Google Scholar]
  43. Fidalgo-Blanco, Á.; Sein-Echaluce, M.L.; García-Peñalvo, F.J. Ontological Flip Teaching: A Flip Teaching model based on knowledge management. Univers. Access Inf. Soc. 2018. [Google Scholar] [CrossRef]
  44. Sein-Echaluce, M.L.; Fidalgo Blanco, Á.; García Peñalvo, F.J. Trabajo en equipo y Flip Teaching para mejorar el aprendizaje activo del alumnado—[Peer to Peer Flip Teaching]. In Proceedings of the La innovación docente como misión del profesorado: Congreso Internacional Sobre Aprendizaje, Innovación y Competitividad, Zaragoza, Spain, 4–6 October 2017; Servicio de Publicaciones Universidad: Zaragoza, Spain, 2017; pp. 1–6. [Google Scholar]
  45. Fidalgo-Blanco, A.; Martinez-Nuñez, M.; Borrás-Gene, O.; Sanchez-Medina, J.J. Micro flip teaching—An innovative model to promote the active involvement of students. Comput. Human Behav. 2017. [Google Scholar] [CrossRef]
  46. Huang, B.; Hew, K.F.; Lo, C.K. Investigating the effects of gamification-enhanced flipped learning on undergraduate students’ behavioral and cognitive engagement. Interact. Learn. Environ. 2019, 27, 1106–1126. [Google Scholar] [CrossRef]
  47. Pons, O.; Franquesa, J.; Hosseini, S.M.A. Integrated Value Model to assess the sustainability of active learning activities and strategies in architecture lectures for large groups. Sustainability 2019, 11, 2917. [Google Scholar] [CrossRef] [Green Version]
  48. Sergis, S.; Sampson, D.G.; Pelliccione, L. Investigating the impact of Flipped Classroom on students’ learning experiences: A Self-Determination Theory approach. Comput. Human Behav. 2018. [Google Scholar] [CrossRef]
  49. Fonseca, D.; Conde, M.Á.; García-Peñalvo, F.J. Improving the information society skills: Is knowledge accessible for all? Univers. Access Inf. Soc. 2018. [Google Scholar] [CrossRef]
  50. Olmos Migueláñez, S. Evaluación formativa y sumativa de estudiantes universitarios: Aplicación de las tecnologías a la evaluación educativa. Teoría Educ. Educ. Cult. Soc. Inf. 2009, 10, 305–307. [Google Scholar] [CrossRef]
  51. Williamson, B. Big Data in Education: The Digital Future of Learning, Policy and Practice; SAGE Publications Ltd.: Southend Oaks, CA, USA, 2017. [Google Scholar]
  52. Enhancing Teaching and Learning Through Educational Data Mining and Learning Analytics: An Issue Brief. Available online: https://mobile.eduq.info/xmlui/handle/11515/35829 (accessed on 1 March 2021).
  53. Analytics for Learning and Teaching Harmelen—Google Acadèmic. Available online: https://scholar.google.com/scholar?hl=ca&as_sdt=0%2C5&q=Analytics+for+learnig+and+teaching+harmelen&btnG= (accessed on 1 March 2021).
  54. Chatti, M.A.; Dyckhoff, A.L.; Schroeder, U.; Thüs, H. A reference model for learning analytics. Int. J. Technol. Enhanc. Learn. 2012, 4, 318–331. [Google Scholar] [CrossRef]
  55. Siemens, G. Learning analytics: Envisioning a research discipline and a domain of practice. In Proceedings of the ACM International Conference Proceeding Series; ACM Press: New York, NY, USA, 2012; pp. 4–8. Available online: https://dl.acm.org/doi/10.1145/2330601.2330605 (accessed on 1 March 2021).
  56. Clow, D. The learning analytics cycle: Closing the loop effectively. In Proceedings of the ACM International Conference Proceeding Series; ACM Press: New York, NY, USA, 2012; pp. 134–138. Available online: http://oro.open.ac.uk/34330/ (accessed on 1 March 2021).
  57. Simon, D.; Fonseca, D.; Necchi, S.; Vanesa-Sánchez, M.; Campanyà, C. Architecture and Building Enginnering Educational Data Mining. Learning Analytics for detecting academic dropout. In Proceedings of the Iberian Conference on Information Systems and Technologies, CISTI, Coimbra, Portugal, 19–22 June 2019. [Google Scholar]
  58. Prenger, R.; Schildkamp, K. Data-based decision making for teacher and student learning: A psychological perspective on the role of the teacher. Educ. Psychol. 2018, 38, 734–752. [Google Scholar] [CrossRef] [Green Version]
59. Cohen, L.; Manion, L.; Morrison, K. Research Methods in Education, 8th ed.; T&F INDIA: London, UK, 2017. [Google Scholar]
  60. McKim, C.A. The Value of Mixed Methods Research: A Mixed Methods Study. J. Mix. Methods Res. 2017, 11, 202–222. [Google Scholar] [CrossRef]
  61. Jamali, H.R. Does research using qualitative methods (grounded theory, ethnography, and phenomenology) have more impact? Libr. Inf. Sci. Res. 2018, 40, 201–207. [Google Scholar] [CrossRef]
  62. Joshi, A.; Kale, S.; Chandel, S.; Pal, D. Likert Scale: Explored and Explained. Br. J. Appl. Sci. Technol. 2015, 7, 396–403. [Google Scholar] [CrossRef]
  63. Holland, P.W. Statistics and causal inference. J. Am. Stat. Assoc. 1986, 81, 945–960. [Google Scholar] [CrossRef]
  64. Polkinghorne, D.E. Narrative configuration in qualitative analysis. Int. J. Qual. Stud. Educ. 1995, 8. [Google Scholar] [CrossRef]
  65. Næss, P. Validating explanatory qualitative research: Enhancing the interpretation of interviews in urban planning and transportation research. Appl. Mobilities 2020, 5, 186–205. [Google Scholar] [CrossRef]
  66. Ardianti, S.; Sulisworo, D.; Pramudya, Y.; Raharjo, W. The impact of the use of STEM education approach on the blended learning to improve student’s critical thinking skills. Univers. J. Educ. Res. 2020, 8. [Google Scholar] [CrossRef]
  67. Kizkapan, O.; Bektas, O. The effect of project based learning on seventh grade students’ academic achievement. Int. J. Instr. 2017, 10. [Google Scholar] [CrossRef]
  68. Starks, H.; Trinidad, S.B. Choose your method: A comparison of phenomenology, discourse analysis, and grounded theory. Qual. Health Res. 2007, 17. [Google Scholar] [CrossRef]
  69. Campbell, R.; Pound, P.; Pope, C.; Britten, N.; Pill, R.; Morgan, M.; Donovan, J. Evaluating meta-ethnography: A synthesis of qualitative research on lay experiences of diabetes and diabetes care. Soc. Sci. Med. 2003, 56. [Google Scholar] [CrossRef]
  70. Bradley, E.H.; Curry, L.A.; Devers, K.J. Qualitative data analysis for health services research: Developing taxonomy, themes, and theory. Health Serv. Res. 2007, 42. [Google Scholar] [CrossRef] [Green Version]
  71. Jones, P.W.; Harding, G.; Berry, P.; Wiklund, I.; Chen, W.H.; Kline Leidy, N. Development and first validation of the COPD Assessment Test. Eur. Respir. J. 2009, 34. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  72. Mays, N.; Pope, C. Qualitative Research: Observational methods in health care settings. BMJ 1995, 311. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  73. Sampson, H. Navigating the waves: The usefulness of a pilot in qualitative research. Qual. Res. 2004, 4. [Google Scholar] [CrossRef]
  74. Angrist, J.D.; Imbens, G.W.; Rubin, D.B. Identification of Causal Effects Using Instrumental Variables. J. Am. Stat. Assoc. 1996, 91, 444–455. [Google Scholar] [CrossRef]
  75. Rauch, A.; Wiklund, J.; Lumpkin, G.T.; Frese, M. Entrepreneurial orientation and business performance: An assessment of past research and suggestions for the future. Entrep. Theory Pract. 2009, 33. [Google Scholar] [CrossRef]
  76. Justice, A.C.; Holmes, W.; Gifford, A.L.; Rabeneck, L.; Zackin, R.; Sinclair, G.; Weissman, S.; Neidig, J.; Marcus, C.; Chesney, M.; et al. Development and validation of a self-completed HIV symptom index. J. Clin. Epidemiol. 2001, 54 (Suppl. 1), S77–S90. [Google Scholar] [CrossRef]
  77. Smith, G.D.; Phillips, A.N. Correlation without a cause: An epidemiological odyssey. Int. J. Epidemiol. 2020, 49, 4–14. [Google Scholar] [CrossRef]
  78. Maxwell, J.A. Causal Explanation, Qualitative Research, and Scientific Inquiry in Education. Educ. Res. 2004, 33. [Google Scholar] [CrossRef] [Green Version]
  79. European Credit Transfer and Accumulation System (ECTS). Education and Training. Available online: https://ec.europa.eu/education/node/90_en (accessed on 31 March 2021).
  80. ECTS Users’ Guide 2015—Publications Office of the EU. Available online: https://op.europa.eu/en/publication-detail/-/publication/da7467e6-8450-11e5-b8b7-01aa75ed71a1 (accessed on 31 March 2021).
  81. UserLab|La Salle|Campus Barcelona. Available online: https://www.salleurl.edu/en/userlab (accessed on 1 April 2021).
82. Pifarré, M.; Tomico, O. Bipolar laddering (BLA): A participatory subjective exploration method on user experience. In Proceedings of the 2007 Conference on Designing for User eXperiences, DUX ’07, Chicago, IL, USA, 5–7 November 2007; ACM Press: New York, NY, USA, 2007; p. 2. [Google Scholar]
  83. Fonseca, D.; Redondo, E.; Valls, F.; Villagrasa, S. Technological adaptation of the student to the educational density of the course. A case study: 3D architectural visualization. Comput. Human Behav. 2017. [Google Scholar] [CrossRef]
  84. Sanchez-Sepulveda, M.; Fonseca, D.; Franquesa, J.; Redondo, E. Virtual interactive innovations applied for digital urban transformations. Mixed approach. Futur. Gener. Comput. Syst. 2019. [Google Scholar] [CrossRef]
  85. Fonseca, D.; Redondo, E.; Villagrasa, S. Mixed-methods research: A new approach to evaluating the motivation and satisfaction of university students using advanced visual technologies. Univers. Access Inf. Soc. 2015. [Google Scholar] [CrossRef] [Green Version]
  86. Fonseca, D.; Valls, F.; Redondo, E.; Villagrasa, S. Informal interactions in 3D education: Citizenship participation and assessment of virtual urban proposals. Comput. Human Behav. 2016. [Google Scholar] [CrossRef] [Green Version]
87. The Thinker’s Guide to Socratic Questioning—Richard Paul, Linda Elder—Google Books. Available online: https://books.google.es/books?hl=ca&lr=&id=ADWbDwAAQBAJ&oi=fnd&pg=PA2&dq=socratic+questioning&ots=1jEZAVwM37&sig=K8apWCQOY7ZOopCPV700LHsqkiA&redir_esc=y#v=onepage&q=socraticquestioning&f=false (accessed on 1 April 2021).
  88. Turkcapar, M.; Kahraman, M.; Sargin, A. Guided Discovery with Socratic Questioning. J. Cogn. Psychother. Res. 2015, 4, 47. [Google Scholar] [CrossRef]
  89. Dworkin, S.L. Sample size policy for qualitative studies using in-depth interviews. Arch. Sex. Behav. 2012, 41, 1319–1320. [Google Scholar] [CrossRef] [Green Version]
  90. Vasileiou, K.; Barnett, J.; Thorpe, S.; Young, T. Characterising and justifying sample size sufficiency in interview-based studies: Systematic analysis of qualitative health research over a 15-year period. BMC Med. Res. Methodol. 2018, 18, 148. [Google Scholar] [CrossRef] [Green Version]
  91. Kozleski, E.B. The Uses of Qualitative Research: Powerful Methods to Inform Evidence-Based Practice in Education. Res. Pract. Pers. Sev. Disabil. 2017, 42, 19–32. [Google Scholar] [CrossRef]
  92. Leydens, J.A.; Moskal, B.M.; Pavelich, M.J. Qualitative methods used in the assessment of engineering education. J. Eng. Educ. 2004, 93, 65–72. [Google Scholar] [CrossRef]
  93. Bailey, R.; Smith, M.C. Implementation and assessment of a blended learning environment as an approach to better engage students in a large systems design class. In Proceedings of the ASEE Annual Conference and Exposition, Atlanta, GA, USA, 23–26 June 2013. [Google Scholar]
94. Ghadiri, K.; Qayoumi, M.H.; Junn, E.; Hsu, P.; Sujitparapitaya, S. Developing and implementing effective instructional stratagems in STEM. In Proceedings of the ASEE Annual Conference and Exposition, Indianapolis, IN, USA, 15–18 June 2014. [Google Scholar]
  95. Mok, H.N. Teaching tip: The flipped classroom. J. Inf. Syst. Educ. 2014, 25, 7–11. [Google Scholar]
  96. Clark, R.M.; Norman, B.A.; Besterfield-Sacre, M. Preliminary experiences with “Flipping” a facility layout/material handling course. In Proceedings of the IIE Annual Conference and Expo 2014, Montreal, QC, Canada, 31 May–3 June 2014. [Google Scholar]
  97. Holmes, N. Engaging with assessment: Increasing student engagement through continuous assessment. Act. Learn. High. Educ. 2018, 19, 23–34. [Google Scholar] [CrossRef] [Green Version]
  98. Shih, W.L.; Tsai, C.Y. Students’ perception of a flipped classroom approach to facilitating online project-based learning in marketing research courses. Australas. J. Educ. Technol. 2017, 33. [Google Scholar] [CrossRef] [Green Version]
Figure 1. BLA interview form.
Table 1. Common elements in the strengths section mentioned by the students in the BLA survey (¹ elements related to FL).
Element | Category | Mentions
Videos ¹ | SDB | 39
Continuous assessment | SAA | 34
Online classes development and adaptation ¹ | SPA | 20
Guidance/proximity/accessibility/care of the teacher ¹ | STA | 17
Documentation uploaded in Moodle ¹ | SDA | 12
Division of topics ¹ | SPB | 9
Explanations of the teacher | STA | 7
Exercises solved in class ¹ | SPA | 5
Subject organization ¹ | SPB | 5
No coursework | SPB | 3
Class activities/dynamics ¹ | SPA | 3
Formulae given in exams | SAB | 2
Students’ participation in online classes ¹ | SSA | 2
Adequate workload ¹ | SPB | 2
Combination of theory and practice ¹ | SPB | 2
Real cases used as example | SPB | 2
Good schedule (requires to be active) | SPC | 2
Applicability of the subject to real cases | SCA | 2
Flipped class ¹ | SPB | 2
Quality of teachers | STA | 2
Table 2. Common elements in the weaknesses section mentioned by the students in the BLA survey (¹ elements related to FL).
Element | Category | Mentions
Too much content to be assimilated in such a short time | WCC | 14
Exams are too long/short of time/stressful | WAB | 12
Online classes | WPA | 8
Classes scheduled at 8:00 am | WPC | 7
Lack of time for doubts in class ¹ | WPA | 6
Too much theory/too little practice ¹ | WPB | 5
Exams too close in time | WAB | 5
More topics should be included in the syllabus | WCA | 5
Pace is too fast ¹ | WPA | 3
Lack of coordination between groups | WPA | 2
Online classes software | WPD | 2
Lack of continuous assessment/Grade depends too much on the final exam | WAA | 2
Some material is only posted in English and Catalan ¹ | WDA | 2
No questions allowed in exams | WAB | 2
Students find it difficult to ask questions in class ¹ | WTA | 2
Incomplete notes ¹ | WDA | 2
Organization and continuity of the topics ¹ | WPB | 2
Table 3. Number of positive extreme marks (9 or 10).
Element | Category | Marks ≥ 9
Videos ¹ | SDB | 11
Continuous assessment | SAA | 6
Guidance/proximity/accessibility/care of the teacher ¹ | STA | 6
Explanations of the teacher | STA | 4
Class activities/dynamics ¹ | SPA | 2
Online classes development and adaptation ¹,² | SPA | 2
Real cases used as example | SPB | 2
Attendance counts | SAA | 1
Formulae given in exams | SAB | 1
Applicability of the subject to real cases | SCA | 1
Documentation uploaded in Moodle ¹ | SDA | 1
Good schedule (requires to be active) | SPC | 1
Division of topics ¹ | SPB | 1
Exercises solved in class ¹ | SPA | 1
Subject organization ¹ | SPB | 1
Use of 3 languages | SPA | 1
Teacher concern to adapt to COVID crisis | STA | 1
¹ Elements related to FL; ² elements related to the online format of classes.
Table 4. Number of negative extreme marks (1 or 2).
Element | Category | Marks ≤ 2
Classes scheduled at 8:00 am | WPC | 2
Exams | WAA | 1
Lack of continuous assessment/Grade depends too much on the final exam | WAA | 1
Exams are too long/short of time/stressful | WAB | 1
No questions allowed in exams | WAB | 1
Too much theory/too little practice ¹ | WPB | 1
Organization and continuity of the topics ¹ | WPB | 1
Online classes ² | WPA | 1
¹ Elements related to FL; ² elements related to the online format of classes.
Table 5. Strengths by subcategory mentioned by the students in the BLA survey.
Subcategory | Description | Mentions | Index of Mention | Category Weight
SAA | Assessment method | 35 | 72.92% | 19.35%
SAB | Assessment development/organization | 3 | 6.25% | 1.61%
SCA | Content of the course | 4 | 8.33% | 2.15%
SCB | Course approach | 0 | 0.00% | 0.00%
SCC | Time/content rate | 0 | 0.00% | 0.00%
SDA | Documents | 13 | 27.08% | 6.99%
SDB | Videos | 39 | 81.25% | 20.97%
SPA | Class development | 27 | 56.25% | 17.20%
SPB | System and methodology | 21 | 43.75% | 13.44%
SPC | Schedule | 2 | 4.17% | 1.08%
SPD | Technology use | 1 | 2.08% | 0.54%
SSA | Students’ motivation | 2 | 4.17% | 1.08%
SSB | Students’ profit | 1 | 2.08% | 0.54%
STA | Teachers | 23 | 47.92% | 15.05%
Table 6. Weaknesses by subcategory mentioned by the students in the BLA survey.
Subcategory | Description | Mentions | Index of Mention | Category Weight
WAA | Assessment method | 3 | 4.17% | 3.03%
WAB | Assessment development/organization | 21 | 37.50% | 21.21%
WCA | Content of the course | 6 | 12.50% | 6.06%
WCB | Course approach | 0 | 0.00% | 0.00%
WCC | Time/content rate | 14 | 29.17% | 14.14%
WDA | Documents | 5 | 10.42% | 5.05%
WDB | Videos | 2 | 4.17% | 2.02%
WPA | Class development | 23 | 41.67% | 23.23%
WPB | System and methodology | 10 | 20.83% | 10.10%
WPC | Schedule | 8 | 16.67% | 8.08%
WPD | Technology use | 3 | 6.25% | 3.03%
WSA | Students’ motivation | 1 | 2.08% | 1.01%
WSB | Students’ profit | 0 | 0.00% | 0.00%
WTA | Teachers | 3 | 4.17% | 3.03%
Table 7. Average rating and SD for both strengths and weaknesses grouped by subcategory.
Subcategory | Description | Strengths M | Strengths SD | Weaknesses M | Weaknesses SD
AA | Assessment method | 8.875 | 1.102 | 2.500 | 0.500
AB | Assessment development/organization | 8.500 | 0.500 | 3.917 | 2.050
CA | Content of the course | 9.000 | 0.000 | 5.500 | 0.500
CB | Course approach | - | - | - | -
CC | Time/content rate | - | - | 4.500 | 0.957
DA | Documents | 8.200 | 0.980 | 3.000 | 0.000
DB | Videos | 9.267 | 1.123 | - | -
PA | Class development | 9.125 | 0.781 | 4.250 | 1.920
PB | System and methodology | 9.333 | 0.943 | 4.000 | 2.098
PC | Schedule | 9.000 | 0.000 | 2.667 | 1.700
PD | Technology use | - | - | 4.000 | 1.000
SA | Students’ motivation | 6.000 | 0.000 | - | -
SB | Students’ profit | - | - | - | -
TA | Teachers | 9.556 | 0.643 | 5.000 | 0.000
Table 8. Average rating, standard deviation and 95% confidence interval when combining the strengths and weaknesses ratings of the qualitative survey, grouped by subcategory.
Subcategory | Description | n | M | SD | μ | X1 | X2
AA | Assessment method | 14 | 7.964 | 2.460 | 1.289 | 6.68 | 9.25
AB | Assessment development/organization | 8 | 5.063 | 2.674 | 1.853 | 3.21 | 6.92
CA | Content of the course | 3 | 6.667 | 1.700 | 1.923 | 4.74 | 8.59
CB | Course approach | - | - | - | - | - | -
CC | Time/content rate | 6 | 4.500 | 0.957 | 0.766 | 3.73 | 5.27
DA | Documents | 6 | 7.333 | 2.134 | 1.708 | 5.63 | 9.04
DB | Videos | 15 | 9.267 | 1.123 | 0.569 | 8.70 | 9.84
PA | Class development | 12 | 7.500 | 2.630 | 1.488 | 6.01 | 8.99
PB | System and methodology | 11 | 6.909 | 3.088 | 1.825 | 5.08 | 8.73
PC | Schedule | 4 | 4.250 | 3.112 | 3.050 | 1.20 | 7.30
PD | Technology use | 2 | 4.000 | 1.000 | 1.386 | 2.61 | 5.39
SA | Students’ motivation | 1 | 6.000 | 0.000 | 0.000 | 6.00 | 6.00
SB | Students’ profit | - | - | - | - | - | -
TA | Teachers | 10 | 9.100 | 1.497 | 0.928 | 8.17 | 10.03
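The formula behind μ, X1 and X2 in Table 8 (and later in Table 12) is not given in this excerpt, but the reported values are consistent with a normal-approximation 95% confidence interval around the mean, with μ as its half-width:

$$ \mu \approx 1.96\,\frac{SD}{\sqrt{n}}, \qquad X_1 = M - \mu, \qquad X_2 = M + \mu $$

For subcategory AA, for example, 1.96 × 2.460/√14 ≈ 1.289, and 7.964 ± 1.289 reproduces the reported bounds of 6.68 and 9.25.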
Table 9. Answers regarding satisfaction with several aspects of the course (n = 47). Responses on a five-point Likert scale ¹ are reported as n (%).
Aspect | Subcategory | 1 | 2 | 3 | 4 | 5 | M | SD | Skw | Kme
Continuous assessment | AA | 0 (0.0) | 2 (4.3) | 4 (8.5) | 16 (34.0) | 25 (53.2) | 4.362 | 0.810 | −1.228 | 1.221
Exams | AB | 2 (4.3) | 6 (12.8) | 9 (19.1) | 24 (51.1) | 6 (12.8) | 3.553 | 1.007 | −0.772 | 0.215
Course approach | CB | 0 (0.0) | 1 (2.1) | 8 (17.0) | 28 (59.6) | 10 (21.3) | 4.000 | 0.684 | −0.399 | 0.476
Time/content rate | CC | 4 (8.5) | 11 (23.4) | 17 (36.2) | 11 (23.4) | 4 (8.5) | 3.000 | 1.072 | 0.000 | −0.510
Course documents | DA | 0 (0.0) | 0 (0.0) | 5 (10.6) | 20 (42.6) | 22 (46.8) | 4.362 | 0.666 | −0.565 | −0.652
Content of the videos | DB | 1 (2.1) | 0 (0.0) | 8 (17.0) | 18 (38.3) | 20 (42.6) | 4.191 | 0.866 | −1.165 | 2.203
Organization of the videos | DB | 1 (2.2) | 1 (2.2) | 6 (13.0) | 18 (39.1) | 20 (43.5) | 4.196 | 0.900 | −1.289 | 2.284
Quality of the videos | DB | 1 (2.1) | 3 (6.4) | 4 (8.5) | 15 (31.9) | 24 (51.1) | 4.234 | 0.994 | −1.391 | 1.699
Usefulness of the videos | DB | 1 (2.1) | 1 (2.1) | 8 (17.0) | 10 (21.3) | 27 (57.4) | 4.298 | 0.966 | −1.333 | 1.582
Online classes | PA | 4 (8.9) | 3 (6.7) | 9 (20.0) | 14 (31.1) | 15 (33.3) | 3.733 | 1.236 | −0.824 | −0.109
Exercises done in class | PA | 0 (0.0) | 3 (6.4) | 15 (31.9) | 14 (29.8) | 15 (31.9) | 3.872 | 0.937 | −0.210 | −1.054
Doubt solving | PA | 0 (0.0) | 3 (6.4) | 4 (8.5) | 24 (51.1) | 16 (34.0) | 4.128 | 0.815 | −0.945 | 0.943
Theory and practice integration | PB/CA | 0 (0.0) | 3 (6.4) | 14 (29.8) | 18 (38.3) | 12 (25.5) | 3.830 | 0.883 | −0.218 | −0.755
Flipped class | PB | 1 (2.4) | 1 (2.4) | 18 (42.9) | 14 (33.3) | 8 (19.0) | 3.643 | 0.895 | −0.234 | 0.395
Teachers’ explanations | TA | 1 (2.1) | 0 (0.0) | 6 (12.8) | 20 (42.6) | 20 (42.6) | 4.234 | 0.831 | −1.350 | 3.287
Teacher-student interaction | TA | 0 (0.0) | 0 (0.0) | 7 (14.9) | 14 (29.8) | 26 (55.3) | 4.404 | 0.734 | −0.797 | −0.682
¹ Likert scale from 1 = “Not at all satisfied” to 5 = “Extremely satisfied”.
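To illustrate how the row-level parameters (M, SD, Skw and Kme, which appear to be the mean, standard deviation, skewness and kurtosis) can be obtained from the reported response counts, the following minimal Python sketch reproduces the “Continuous assessment” row of Table 9. It is not the authors’ code, and the estimator choice is inferred rather than documented: the reported values for this row happen to match the population standard deviation, the biased skewness and the bias-corrected excess kurtosis.

```python
# Minimal sketch (not the authors' code): reproduce the descriptive statistics
# of one Likert item of Table 9 from its response counts.
import numpy as np
from scipy import stats

counts = {1: 0, 2: 2, 3: 4, 4: 16, 5: 25}   # "Continuous assessment" (AA), n = 47
responses = np.repeat(list(counts.keys()), list(counts.values()))

m = responses.mean()                          # 4.362, as reported
sd = responses.std(ddof=0)                    # 0.810 (population SD)
skw = stats.skew(responses, bias=True)        # -1.228 (biased skewness)
kme = stats.kurtosis(responses, bias=False)   # ~1.221 (bias-corrected excess kurtosis)

print(f"M = {m:.3f}, SD = {sd:.3f}, Skw = {skw:.3f}, Kme = {kme:.3f}")
```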
Table 10. Answers regarding students’ motivation for different activities (n = 47). Items were phrased in the format “The activity motivates me” and answered on a five-point Likert scale ¹, reported as n (%).
Activity | Subcategory | 1 | 2 | 3 | 4 | 5 | M | SD | Skw | Kme
Attend face-to-face classes | SA | 3 (6.52) | 4 (8.70) | 10 (21.74) | 9 (19.57) | 20 (43.48) | 3.848 | 1.251 | −0.778 | −0.385
Listen to teachers’ explanations | SA | 3 (6.52) | 0 (0.00) | 10 (21.74) | 14 (30.43) | 19 (41.30) | 4.000 | 1.103 | −1.165 | 1.249
Do exercises in class | SA | 1 (2.22) | 1 (2.22) | 12 (26.67) | 17 (37.78) | 14 (31.11) | 3.933 | 0.929 | −0.700 | 0.644
Solve doubts in class | SA | 2 (4.26) | 1 (2.13) | 14 (29.79) | 12 (25.53) | 18 (38.30) | 3.915 | 1.069 | −0.772 | 0.300
Study notes at home | SA | 4 (8.70) | 2 (4.35) | 18 (39.13) | 12 (26.09) | 10 (21.74) | 3.478 | 1.137 | −0.479 | −0.073
Watch videos at home | SA | 4 (9.09) | 3 (6.82) | 12 (27.27) | 12 (27.27) | 13 (29.55) | 3.614 | 1.229 | −0.630 | −0.342
Continuous assessment | SA | 2 (4.35) | 0 (0.00) | 12 (26.09) | 12 (26.09) | 20 (43.48) | 4.043 | 1.042 | −1.010 | 0.970
Attend online classes | SA | 9 (19.57) | 9 (19.57) | 10 (21.74) | 12 (26.09) | 6 (13.04) | 2.935 | 1.325 | −0.048 | −1.182
¹ Likert scale from 1 = “Strongly disagree” to 5 = “Strongly agree”.
Table 11. Other questions in the survey related to subcategories (n = 47). Responses on a five-point Likert scale ¹ are reported as n (%).
Question | Subcategory | 1 | 2 | 3 | 4 | 5 | M | SD | Skw | Kme
Structural knowledge is useful for architects | CA | 0 (0.0) | 1 (2.2) | 4 (8.7) | 20 (43.5) | 21 (45.7) | 4.326 | 0.724 | −0.927 | 0.877
I will use technical knowledge in my professional life | CA/SA | 4 (8.5) | 5 (10.6) | 11 (23.4) | 14 (29.8) | 13 (27.7) | 3.574 | 1.233 | −0.580 | −0.521
The course has met my expectations | SB | 0 (0.0) | 2 (4.3) | 9 (19.1) | 23 (48.9) | 13 (27.7) | 4.000 | 0.799 | −0.501 | −0.061
The course has met my needs | SB | 1 (2.2) | 4 (8.7) | 13 (28.3) | 16 (34.8) | 12 (26.1) | 3.739 | 1.009 | −0.476 | −0.234
I learned a lot with this structural course | SB | 2 (4.3) | 0 (0.0) | 8 (17.0) | 20 (42.6) | 17 (36.2) | 4.064 | 0.954 | −1.302 | 2.501
¹ Likert scale from 1 = “Strongly disagree” to 5 = “Strongly agree”.
Table 12. Statistical analysis per subcategory obtained from the quantitative assessment (n = 47). The last three columns (μ, X1, X2) report the 95% confidence interval.
Subcategory | Description | M | SD | Skw | Kme | μ | X1 | X2
AA | Assessment method | 4.362 | 0.810 | −1.228 | 1.221 | 0.232 | 4.130 | 4.593
AB | Assessment development/organization | 3.553 | 1.007 | −0.772 | 0.215 | 0.288 | 3.265 | 3.841
CA | Content of the course | 3.904 | 0.698 | −0.371 | −0.432 | 0.200 | 3.705 | 4.104
CB | Course approach | 4.000 | 0.684 | −0.399 | 0.476 | 0.196 | 3.804 | 4.196
CC | Time/content rate | 3.000 | 1.072 | 0.000 | −0.510 | 0.306 | 2.694 | 3.306
DA | Documents | 4.362 | 0.666 | −0.565 | −0.652 | 0.190 | 4.171 | 4.552
DB | Videos | 4.230 | 0.872 | −1.424 | 2.709 | 0.249 | 3.981 | 4.480
PA | Class development | 3.915 | 0.745 | −0.467 | −0.149 | 0.213 | 3.702 | 4.128
PB | System and methodology | 3.755 | 0.798 | −0.145 | −0.538 | 0.228 | 3.527 | 3.983
PC | Schedule | - | - | - | - | - | - | -
PD | Technology use | - | - | - | - | - | - | -
SA | Students’ motivation | 3.712 | 0.793 | −1.342 | 2.572 | 0.227 | 3.485 | 3.938
SB | Students’ profit | 3.940 | 0.823 | −0.654 | 0.204 | 0.235 | 3.704 | 4.175
TA | Teachers | 4.319 | 0.631 | −0.630 | −0.556 | 0.180 | 4.139 | 4.500
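The confidence-interval columns of Table 12 follow the same pattern as in Table 8 and can be reproduced, up to rounding, from the reported M and SD with n = 47. A minimal Python sketch, assuming μ = 1.96·SD/√n (an assumption consistent with the table, not a formula stated in this excerpt):

```python
# Minimal sketch (assumption, not the authors' code): rebuild the 95% CI
# columns of Table 12 from the reported M and SD with n = 47.
import math

n = 47
rows = {                      # subcategory: (M, SD) as reported in Table 12
    "AA": (4.362, 0.810),
    "DB": (4.230, 0.872),
    "TA": (4.319, 0.631),
}

for subcat, (m, sd) in rows.items():
    mu = 1.96 * sd / math.sqrt(n)
    print(f"{subcat}: mu = {mu:.3f}, X1 = {m - mu:.3f}, X2 = {m + mu:.3f}")
    # e.g., AA -> mu = 0.232, X1 = 4.130, X2 = 4.594 (Table 12: 4.130 / 4.593;
    # the 0.001 gap comes from rounding M and SD before this check).
```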
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
