Article

Teaching Sentiment in Emergency Online Learning—A Conceptual Model

Domingos Martinho, Pedro Sobreiro and Ricardo Vardasca
1 ISLA Santarém, Largo Candido dos Reis, 2000-241 Santarém, Portugal
2 Sport Sciences School of Rio Maior, Polytechnic Institute of Santarém, 2040-413 Rio Maior, Portugal
* Author to whom correspondence should be addressed.
Educ. Sci. 2021, 11(2), 53; https://doi.org/10.3390/educsci11020053
Submission received: 13 January 2021 / Revised: 25 January 2021 / Accepted: 27 January 2021 / Published: 30 January 2021
(This article belongs to the Special Issue New Research and Trends in Higher Education)

Abstract
Due to the COVID-19 pandemic, higher education institutions with a face-to-face model have found themselves forced to migrate to online learning. This study explores the perspective of the lecturers at a Portuguese private higher education institution, all of whom were invited to participate regardless of their research area. It aims to propose and test a conceptual model that combines attitudes, preferred activities, and technological experience with the sentiment about the impact of this experience on the students’ learning process, on their teaching activity, and on the strategy of higher education institutions. An online questionnaire was administered to 65 lecturers engaged in emergency online lecturing. The obtained results showed that lecturers reveal a positive attitude towards online lecturing, tend to prefer the activities in which they feel most comfortable in face-to-face lecturing, and consider their technological experience useful for online activities. Lecturers have a positive sentiment about the impact of online learning on students’ learning, on their faculty career, and on the strategy of higher education institutions. The test of the proposed conceptual model shows that it has well-fitting conditions. The results confirm most of the formulated hypotheses, namely the predictive effect of preferred activities and technological experience on sentiment. Faculty engagement in emergency online lecturing shows that its members are willing to participate in the changing process, and the proposed conceptual model can be used to assess this readiness.

1. Introduction

The COVID-19 pandemic has affected the activities of higher education institutions (HEIs), which had to act to protect their lecturers, staff, and students in a public health emergency. The institutions had no alternative but to cancel all face-to-face lectures, including labs and other learning experiences, and to require lecturers to switch their courses completely to emergency online learning, reducing contacts and thereby preventing the spread of the virus.
This teaching model, which many call “emergency remote teaching” [1], includes the use of fully remote teaching solutions, mediated by the internet, to ensure activities that would otherwise be taught face-to-face, returning to that format once the crisis or emergency is overcome [1]. The model that was followed seems similar to the online learning described by Anderson [2], a type of teaching and learning in which: (1) the student and the lecturer are at a physical distance; (2) student–content, student–lecturer, and student–student interactions are mediated by technology; and (3) some type of support is provided [2].
In the COVID-19 context, higher education lecturers were challenged by the need for the adoption of online learning practices, for which the majority were not prepared [3], and there were no indications that they were interested in using it [4]. The faculty members had to prepare and teach their lectures from home, with all the practical and technical challenges that this entails, and often without adequate technical support [1]. In addition to the lack of required online specific pedagogical competences, it is generally agreed that in a normal situation, the challenge to effectively transfer what is taught in a face-to-face classroom to an online version remains a problem [3]. Most of these lecturers, who normally develop their activities face-to-face, do not reveal an interest in online learning (only about 30% to 35% consider this option) [4,5]. This position is caused by the lack of motivation and incentives resulting from various obstacles that can be summarized as technological readiness [6,7,8], absence of organizational incentive to compensate for extra work [9,10], and the prejudices related to the value of online teaching [5,11,12].
In a normal situation, the most relevant motivations for adopting online learning are related to the concern of reaching new audiences, diversifying the HEI’s offer, and contributing to the management of organizational change and the positioning of the HEI’s offer in the context of online education [9,11,12,13].
In the emergency caused by COVID-19, lecturers needed, overnight, to use tools with which they felt comfortable [14]. Face-to-face lecturers thus needed to develop online teaching activities in order to avoid the collapse of the teaching and learning process. In this situation, lecturers adopted emergency remote teaching, which, as stated by Hodges et al. [1] (p. 6), “is a temporary shift of instructional delivery to an alternate delivery mode due to crisis circumstances”. Emergency online teaching is different from all other situations in which online teaching and learning activities are planned by lecturers who have online teaching skills. For many of these lecturers, with little or no experience in online teaching, the option was to transport the typical activities they developed for face-to-face teaching to the online environment and, gradually, introduce activities that would allow more meaningful learning [15]. Despite the limitations in skills and support, lecturers have a positive sentiment about emergency online learning [16,17].
The present investigation focuses on the motivations of lecturers with little or no experience in online teaching. Without any other option, these lecturers were required to adopt emergency online teaching. In order to address this great challenge, lecturers had to adapt their attitude towards online education, their preferred activities, and their technological experience. This study aims to investigate whether these attitudes and skills affect online teaching sentiment, and to understand how lecturers perceive the impact of this experience on students’ learning, on their teaching activity, and on the development of the HEI’s online learning strategy.
The document is organized into six sections: the present section introduces the research topic, the motivation, and the aim; the following section presents the conceptual model and the hypotheses of the research; the methodology is then described, followed by the sections presenting the obtained results, their discussion, and the final remarks in the conclusions.

2. Conceptual Model and Hypotheses

From the existing literature, several theories and models have emerged that share the objective of explaining the intention to use technologies through the relationship between latent variables, including external and outcome variables [18,19]. Although these models have been developed with the aim of explaining and predicting the acceptance of computer technologies in general, they have been adapted with a view to their application in more specific contexts, such as online teaching and learning [20,21].
Contrary to previous studies, this study is based on the migration from face-to-face to emergency online education. This migration was carried out without the lecturers involved having had any opportunity to undergo any type of training, and with only minimal support, limited to providing access to the platforms and technologies used. For this study, a conceptual model is proposed that combines factors that can be measured when face-to-face lecturers have transferred their activities to emergency online learning, namely: (1) online teaching attitude (OTA); (2) preferred online activities (POA); (3) technological experience (TEX); and (4) online teaching sentiment (OTS).

2.1. Online Teaching Attitude (OTA)

The attitude towards online teaching and learning is understood here as in other pre-pandemic studies [18,19,20,21]: it consists of appraising individuals’ positive or negative feelings (evaluative affect) about the use of online education [9,18,19]. The following two hypotheses are proposed:
Hypothesis 1 (H1). 
OTA positively affects OTS.
Hypothesis 2 (H2). 
OTA positively affects POA.

2.2. Preferred Online Activities (POA)

The activities proposed by lecturers in emergency online learning, with which most had no previous experience, ended up following those recommended in the existing literature. They chose to diversify the activities and the materials used, thus seeking to match the different student learning profiles [22,23]. The activities preferred by lecturers when migrating to emergency online teaching can be compared with the concept of self-efficacy. According to Joo et al. [24] (p. 50), “self-efficacy refers to lecturers’ personal beliefs about their abilities and skills”. It seems natural that lecturers prefer the activities in which they feel more qualified and competent. Thus, Hypothesis 3 (H3) is suggested: POA positively affects OTS.

2.3. Technological Experience (TEX)

Technological experience identifies the degree of technological readiness [25] of the lecturers from their perspective [26]. As mentioned by Abdullah and Ward [27], experience plays an important role in the adoption of online education and can be defined as “the amount and type of computer skills acquired by a person over time” [27] (p. 34). For Joo et al. [24], “it is important for lecturers to have enough time and opportunities to practice new technologies until they feel comfortable enough to use the technology and perceive that technology”. In a context in which lecturers did not have that time, technological experience seems to be an important factor that can influence online teaching sentiment [28]. The following three hypotheses are proposed:
Hypothesis 4 (H4). 
TEX positively affects OTS.
Hypothesis 5 (H5). 
TEX positively affects OTA.
Hypothesis 6 (H6). 
TEX positively affects POA.

2.4. Online Teaching Sentiment (OTS)

According to Liu [29] (p. 15), “sentiment is the underlying feeling, attitude, evaluation, or emotion associated with an opinion”, which is represented by three aspects: the type, orientation, and intensity of the sentiment. In the context of this work, a lexicon-based approach was used, which involves calculating the orientation of the sentiment from the semantic orientation of words or phrases. The orientation of a sentiment can be positive, neutral, or negative, where neutral means the absence of sentiment or opinion [29,30]. Sentiment intensity is an important aspect for classifying the sentiment associated with a sentence [31]. For example, “good is weaker than excellent, and dislike is weaker than detest” [29] (p. 16).
Sentiment analysis is studied in many different contexts, with machine learning and natural language processing being the most common techniques [32]. In the current research, sentiment analysis was based on natural language processing and information extraction, examining phrases and assigning a sentiment polarity (positive, negative, or neutral) to each of them [29,33]. In this way, the opinions expressed by lecturers in relation to the impact of emergency online teaching were assessed in three aspects: (1) the impact on students’ learning; (2) the impact on their future teaching activity; and (3) the impact on the future HEI online learning strategy.
Based on the previous theoretical variables, the conceptual model with the relationships between all the factors that influence OTS is presented in Figure 1.

3. Methodology

3.1. Participants

The participants (n = 65) were lecturers from a Portuguese private HEI. This HEI has a total of 98 lecturers, all of whom were invited to participate in the questionnaire. The link to the questionnaire was sent to everyone by e-mail, along with an introduction to the research objectives.

3.2. Data Collection

The data were collected through online surveys from April to May 2020. Of the 98 potential respondents, 78 answered the questionnaire (an aggregated response rate of 79%); 13 responses were rejected because of missing values, so the final sample consisted of 66% of the reference population.

3.3. Lecturers’ Personal Information/Demographic Data

In the total sample of lecturers, 40% were female and 60% were male. A total of 1.5% of the lecturers were up to 29 years of age, 13.8% were from 30 to 39 years of age, 46.2% from 40 to 49, 21.5% from 50 to 59, and 16.9% were 60 years of age or older. In terms of academic qualifications, 23.1% of participants held bachelor’s degrees, 33.8% held master’s degrees, and 43.1% held a doctoral degree. In terms of teaching experience, 18.5% had up to 4 years, 20% had from 5 to 9 years, 24.6% had from 10 to 19 years, and 36.9% reported 20 or more years of experience.

3.4. Survey Instrument and Structure

The questionnaire consisted of six sections. The first section was intended to characterize the respondents. In the second section, respondents were asked about their attitude toward online teaching and learning on a 5-point Likert scale (1—lower; 2—sometimes lower; 3—no significant differences; 4—sometimes higher; 5—higher). The third section evaluated the degree of preference/satisfaction with the online activities. A 10-point end-defined scale with ratings from null (1) to high (10) was chosen, in order to increase the sensitivity of the measurement instrument [34]. In the fourth section, respondents were asked to self-assess their technology skills on a 4-point Likert scale (1—none; 2—up to 3 years; 3—from 3 to 6 years; 4—more than 6 years).
The fifth section of the questionnaire presents three open questions about the impact of emergency online teaching and learning, in the present and in the future, on (1) students’ learning, (2) teaching activities, and (3) online learning and teaching in the HEI strategy. These questions are intended to collect data for the sentiment analysis about online learning and teaching. Table 1 presents the constructs of each section and the sources that inspired them.

3.5. Pilot Study for the Questionnaire

A pilot study was conducted to check the reliability of the questionnaire items. The pilot sample size was set at 20% of the target population of this study (98 lecturers), in line with the adopted research criteria. Cronbach’s alpha [40] was computed with IBM SPSS Statistics v26 to judge the internal reliability of the pilot items. A value of 0.7 was taken as an acceptable reliability coefficient for social science research [41,42,43]. The corresponding findings are shown in Table 2.
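For illustration, Cronbach’s alpha can be reproduced directly from the item-level answers. The following is a minimal sketch in base R (the language used for the sentiment analysis described in Section 3.6); the object name pilot_items and the file name are hypothetical and not part of the original analysis.

# Cronbach's alpha: k/(k - 1) * (1 - sum of item variances / variance of the total score)
cronbach_alpha <- function(items) {
  items <- na.omit(items)                  # keep only complete responses
  k <- ncol(items)                         # number of items
  item_variances <- apply(items, 2, var)   # variance of each item
  total_variance <- var(rowSums(items))    # variance of the summed score
  (k / (k - 1)) * (1 - sum(item_variances) / total_variance)
}

# Hypothetical usage with the 11 pilot items:
# pilot_items <- read.csv("pilot_responses.csv")
# cronbach_alpha(pilot_items)   # values of 0.7 or above are taken as acceptable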

3.6. Sentiment Analysis

As described in Section 3.4, the fifth section of the questionnaire presents three open questions about the impact of emergency online teaching and learning on (1) students’ learning, (2) teaching activities, and (3) online learning and teaching in the HEI strategy. The answers to these questions are the data used for the sentiment analysis.
Many sentiment analysis algorithms, applications, and enhancements have been proposed in the last few years [33]. For this work, OpLexicon 3.0 was used: a sentiment lexicon for the Portuguese language, built from multiple sources of information, covering four categories of entries (verbs, adjectives, hashtags, and emoticons). The lexicon contains around 32,000 polarized words classified by their morphological category and annotated with positive (1), negative (−1), and neutral (0) polarities [30,38].
The sentiment analysis was developed in R [44] following the steps represented in Figure 2: (1) the words are extracted from each answer to the open questions in the questionnaire; (2) each word is checked against OpLexicon and its polarity is determined; (3) the polarities of the words in the answer are summed; and (4) the summed polarity is converted to a Likert scale.
The conversion to a Likert scale follows Algorithm 1 below, in which each answer is processed after the determination of the cut points (median values) used to convert the polarity to a scale aligned with the other questions of the survey:
Algorithm 1 Likert Calculation
1: Input: polarity of the open questions
2: Output: Likert values for the open questions
3: Begin: likertCalculation
4: assign median(negative answer polarities) to mNegSent
5: assign median(positive answer polarities) to mPosSent
6: for each answer do
7:     assign the summed polarity of the answer to answerPolarity
8:     if answerPolarity <= mNegSent then
9:         answerLikertScale = 1
10:    else if answerPolarity > mNegSent and answerPolarity < 0 then
11:        answerLikertScale = 2
12:    else if answerPolarity = 0 then
13:        answerLikertScale = 3
14:    else if answerPolarity > 0 and answerPolarity <= mPosSent then
15:        answerLikertScale = 4
16:    else
17:        answerLikertScale = 5
18:    end if
19: end for
20: End: likertCalculation
As an example, consider the opinion “I consider that my adaptation was made in a smooth way”. The first step is the processing of each word: “I (1) consider (2) that (3) my (4) adaptation (5) was (6) made (7) in (8) a (9) smooth (10) way (11)”. To determine the polarity of each word, OpLexicon 3.0 was used. In this example, only the word “smooth” (10) returns the value 1 (positive polarity) from OpLexicon; all the other words have no associated polarity, returning “word is not present in dataset”. The algebraic sum of the returned values is 1; consequently, this answer gets a polarity value of 1. After this step, the “Likert calculation” algorithm is applied, using the medians of the negative and positive answer polarities for each question: (1) negative values less than or equal to the negative median were assigned one, (2) negative values greater than the negative median and less than zero were assigned two, (3) zero (neutral) was assigned three, (4) positive values less than or equal to the positive median were assigned four, and (5) positive values greater than the positive median were assigned five. Null values were replaced by 0, representing the absence of an answer.
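This procedure can be sketched in base R as follows. The sketch is an illustrative reconstruction, not the original script: oplexicon is assumed to be a data frame with columns term and polarity loaded from OpLexicon 3.0, and answers a character vector with one open answer per respondent (both object names are hypothetical).

# Steps 1-3: tokenize an answer, look each word up in the lexicon, and sum the polarities
answer_polarity <- function(answer, lexicon) {
  words <- unlist(strsplit(tolower(answer), "[^[:alpha:]]+"))
  polarities <- lexicon$polarity[match(words, lexicon$term)]
  sum(polarities, na.rm = TRUE)   # words absent from the lexicon contribute 0
}

polarity <- sapply(answers, answer_polarity, lexicon = oplexicon)

# Step 4: convert the summed polarity to a 5-point Likert value (Algorithm 1)
m_neg <- median(polarity[polarity < 0])   # cut point for negative answers
m_pos <- median(polarity[polarity > 0])   # cut point for positive answers
to_likert <- function(p) {
  if (p <= m_neg) 1
  else if (p < 0) 2
  else if (p == 0) 3
  else if (p <= m_pos) 4
  else 5
}
likert <- sapply(polarity, to_likert)

Answers whose words are all absent from the lexicon sum to 0 and are therefore classified as neutral, consistent with the definition of neutrality as the absence of sentiment.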

3.7. SPSS and SmartPLS 3

The demographic data were analyzed with IBM SPSS Statistics v26. SmartPLS 3, graphical user-interface software for estimating PLS-SEM models, was used for the model test [45]. This tool can cope with smaller sample sizes (<100) for the same effect size and model complexity, handles non-normal data, is suited to exploratory research, and can more easily specify formative constructs [46,47].

3.8. Adjustment Quality for the SEM Model

The following fit measures were considered to assess the adjustment quality of the model:
  • Loadings. For a well-fitting model, path loadings should be above 0.70 and “indicator with a measurement loading in the 0.40 to 0.70 range should be dropped if dropping it improves composite reliability” [46] (p. 103). Having tested this option, the conditions were not met, and the items were not dropped.
  • Variance inflation factor (VIF). Indicates multicollinearity. In a well-fitting model, the structural VIF coefficients should not be higher than 5 [48].
  • Cronbach Alpha (CA). George and Mallery [49] suggest the following scale: >0.90 “Excellent”, >0.80 “Good”, >0.70 “Acceptable”, >0.60 “Questionable”, >0.50 “Poor”, and <0.50 “Unacceptable”.
  • Composite reliability (CR). Values between 0.70 and 0.90 are considered satisfactory [46].
  • R-square. Results above the cut-offs of 0.67, 0.33, and 0.19 are considered “substantial”, “moderate”, and “weak”, respectively [46].
  • Average variance extracted (AVE). A value greater than 0.50 means that the model converges to a satisfactory result [50] (the computation of CR and AVE is illustrated in the sketch after this list).
  • Discriminant validity (DV). The square roots of the AVEs should be greater than the correlations of the constructs [51].
  • F-square. Values of 0.02 represents a “small” effect, 0.15 represents a “medium” effect, and 0.35 represents a “high” effect size [46].
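As an illustration of how two of these measures follow from the standardized indicator loadings, the sketch below computes composite reliability and AVE in base R for the OTA construct, using the loadings later reported in Table 3. It is a reference computation under standard formulas, not part of the SmartPLS output, and the function names are the author of this sketch’s own.

# Composite reliability: (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)
composite_reliability <- function(loadings) {
  sum(loadings)^2 / (sum(loadings)^2 + sum(1 - loadings^2))
}

# Average variance extracted: mean of the squared loadings
average_variance_extracted <- function(loadings) mean(loadings^2)

ota_loadings <- c(0.885, 0.900, 0.819)     # OTA1-OTA3 loadings (Table 3)
composite_reliability(ota_loadings)        # ~0.90, within the satisfactory 0.70-0.90 range
average_variance_extracted(ota_loadings)   # ~0.75, above the 0.50 convergence threshold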
The values presented in Table 3, Table 4 and Table 5 show that the model has well-fitting conditions.
Finally, the predictive validity (Stone–Geisser indicator) of the adjusted model was evaluated; Q2 > 0 implies that the model has predictive relevance [46,52] (Table 6).

4. Results

4.1. Online Teaching Attitude

The results showed that respondents have a positive attitude towards online teaching. The item “I have the same availability for online as for face-to-face teaching” (OTA3) has an average of 3.71, while the item “quality of online education in relation to face-to-face education” (OTA1) has 3.25, and the item “I like online education in the same way as face-to-face education” (OTA2) has an average of 3.14 (Table 7).

4.2. Preferred Online Activities

Lecturers revealed greater preference for “online sessions” (POA4) with a mean of 8.48, “oral presentations” (POA3) with 7.66, and “written assignments” (POA2) with 7.34 (Table 8).

4.3. Technological Experience

Respondents showed high experience in the use of “online meeting systems” (TEX1), with an average of 3.94. The remaining items evaluated obtained average values above 3.0 (Table 9).

4.4. Sentiment Analysis

The results of sentiment analysis of the open questions allowed the identification of their sentiment value, as exemplified in Table 10, for impact on lecturers’ careers.
The impact of online learning on students’ learning received 9 responses with a negative sentiment (14%), 14 answers with a neutral sentiment (21%), and 30 responses with a positive sentiment (46%). The sentiment in relation to teaching activities has 11 responses with a negative sentiment (17%), 11 answers with a neutral sentiment (17%), and 31 positive sentiment responses (47%). In relation to the higher education institution, there are 3 answers with a negative sentiment (5%), 12 responses with a neutral sentiment (18%), and 34 positive sentiment responses (52%).
The opinion in relation to the impact of online learning on the institution’s strategy is the one with the highest percentage of positive sentiment (52%), as opposed to only 5% expressing a negative sentiment. The opinion in relation to teaching activities has the highest percentage of negative sentiment (17%), equal to its percentage of neutral sentiment. The overall sentiment distribution is represented in Figure 3.

4.5. PLS Analysis

The path coefficients of the prediction model to the latent variable OTS were positive for POA (0.390) and negative for OTA (−0.169). The TEX coefficients were positive to the latent variables OTA and POA. These results show that TEX has direct and indirect (via OTA (0.160) and POA (0.427)) effects on OTS.
The model also presented OTS1 (student online learning) (0.772), OTS2 (teaching career development) (0.866), and OTS3 (online learning in HEI) (0.867), which had positive path coefficients to OTS (Figure 4).
Specific indirect effects are shown in Table 11.
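Each specific indirect effect in Table 11 is the product of the direct path coefficients along the corresponding route. The short R sketch below reproduces those values from the coefficients reported in this section and in Table 12 (using 0.390 for the POA to OTS path, as reported above); the vector name paths is hypothetical.

# Direct path coefficients from the estimated PLS model (Section 4.5 and Table 12)
paths <- c(TEX_OTA = 0.160, TEX_POA = 0.427, OTA_POA = 0.282,
           OTA_OTS = -0.169, POA_OTS = 0.390)

# A specific indirect effect is the product of the coefficients along the route
paths["TEX_OTA"] * paths["OTA_OTS"]                       # TEX -> OTA -> OTS        ~ -0.027
paths["OTA_POA"] * paths["POA_OTS"]                       # OTA -> POA -> OTS        ~  0.110
paths["TEX_OTA"] * paths["OTA_POA"] * paths["POA_OTS"]    # TEX -> OTA -> POA -> OTS ~  0.018
paths["TEX_POA"] * paths["POA_OTS"]                       # TEX -> POA -> OTS        ~  0.167
paths["TEX_OTA"] * paths["OTA_POA"]                       # TEX -> OTA -> POA        ~  0.045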

5. Discussion

A questionnaire was conducted with the participation of 66% (n = 65) of all lecturers (98) of a Portuguese private HEI who developed their activities in an emergency online teaching environment. The study examined their attitude toward online teaching and the online activities they most value, and investigated whether technological experience influences these attitudes and preferences. The opinions of these lecturers in relation to emergency online teaching, namely its impact on students’ learning, on their professional development, and on the development of the HEI strategy, were also examined. Finally, a conceptual model was proposed and tested to assess the effect of attitudes, activities, and technological experience on online teaching sentiment. In the following points, the results obtained are discussed in relation to the previous literature.

5.1. Attitude toward Online Teaching

The results showed that lecturers have a positive attitude towards emergency online teaching, showing an availability identical to that for face-to-face teaching. This conclusion coincides with other studies conducted in an emergency online teaching context, which show that lecturers report more on the advantages of distance education [53]. This is reinforced by the results obtained in the analysis of the sentiment about the impact of online teaching on teaching and on students’ learning.
Based on this conclusion, at least in an emergency situation, lecturers do not question the value of online teaching. Although this is not the same type of education, these conclusions are more positive than the results obtained in a normal situation when questioning face-to-face lecturers about their availability for and acceptance of online teaching [5,11].

5.2. Preferred Activities

The most preferred activities of lecturers (“online sessions”, “oral presentations”, and “written assignments”) confirm the García-Peñalvo et al. study [15] and reveal that lecturers relied on the “tools” they had mastered and only later began to use resources better adjusted to online teaching and learning. This strategy is confirmed by Rapanta et al. [14], who state that many non-specialist online lecturers chose to focus on materials/resources that they would use anyway to teach the course content, regardless of whether they are face-to-face or online.
Despite the difficulties related to emergency online teaching, which cannot be compared with “normal” online teaching, some of the options found can be questioned. However, as concluded by Spoel et al. [54], there was an attempt to provide students with the basic ingredients for learning (online lectures, group activities, discussion forums, etc.), which reveals a concern with diversification, thus seeking to match the different student learning profiles [22,23].
This adaptability seems to confirm Anderson in that “an excellent e-teacher is an excellent teacher” [2] (p.360), possessing pedagogical skills that allow them to understand the teaching process, in order to be able to make the best use of the range of activities they have at their disposal.

5.3. Technological Experience

Pre-pandemic studies [7,8,10] show that technological readiness can be a factor that conditions the participation of lecturers in online teaching. Although these conclusions cannot be directly transposed to emergency online education, results show that the participants in this study had technological experience in some of the tools for the development of online activities.

5.4. Sentiment Analysis

Lecturers have a positive or neutral sentiment about the impact of emergency online learning on students’ learning. These findings are similar to others, which concluded that lecturers expressed a favorable opinion about the students’ academic performance during the COVID-19 pandemic outbreak [16,17]. The findings of this study are slightly more positive than the results reported by Tartavulea et al. [55], who concluded that emergency online teaching has an overall moderate positive impact on the educational process, although the overall effectiveness of the online educational experience is perceived to be lower than in the case of face-to-face teaching.
Likewise, lecturers expressed a neutral or positive sentiment regarding the impact of emergency online teaching on their professional activity. In addition to showing high availability for online teaching, lecturers do not refer to the eventual need for compensation for the required additional work caused by transposition of face-to-face to emergency online teaching, as studies about online teaching reveal [9].
Lecturers thus seem to prefer to take advantage of the professional development opportunity that the situation offers [4]. These conclusions reveal a positive stance that HEIs that intend to invest in online teaching strategies cannot miss. Studies carried out in a pandemic situation have not focused on this aspect, so it is not possible to make comparisons with similar situations. Despite this, there is pre-pandemic literature that shows that lecturers do not consider online teaching as having a positive impact on their careers [4,13].
The results of the sentiment analysis about the impact of emergency online teaching on the future development of the HEI are in line with other studies carried out outside the emergency context, in which the contribution to organizational change and the positioning of the HEI offer are the aspects most frequently pointed out by lecturers regarding the adoption of online teaching [9,11,12,13]. The extra time and effort invested by lecturers in emergency online teaching can explain the positive perception regarding the impact on the HEI strategy [54].

5.5. Conceptual Model

The results of the conceptual model test show that the model has well-fitting conditions. In relation to each of the tested hypotheses, it is concluded that five of the six hypotheses were confirmed (Table 12). The obtained values show that the effects of POA and TEX on OTS, and of TEX on POA, are strong (>0.35), while the effects of OTA on POA and of TEX on OTA are moderate (>0.15) [56].

5.6. Limitations

Some limitations of the present study must be highlighted. First, the study was carried out in an institution with 98 lecturers, of which 66% submitted valid responses (no missing data). The sample size (n = 65) represents the HEI population, but with all respondents belonging to a single HEI, the study does not allow generalizing the results for Portuguese HEIs.
Another limitation of the study is the fact that the results are based on the respondents’ perceptions, which may cause a bias. Although it was clarified that the answers would only be used for research purposes, respondents may be tempted to choose the “correct” answer or the more socially desirable answer, thus being vulnerable to distortions [42].
The way in which the transition from face-to-face to the emergency online teaching was carried out may justify why lecturers expressed a greater degree of preference for lectures (online sessions). This preference, by itself, could indicate that they merely transposed the “bad” face-to-face practices to the online environment, namely the face-to-face expository sessions. However, despite this greater preference, there is a significant degree of adherence to other activities, namely oral presentations, written assignments (in group), discussion forums, and chat. The diversity and characteristics of these activities can enhance student–lecturer or student–student interaction, leaving good indications about teaching and learning process [2].
The conditions available were certainly not the same in all institutions, just as they are not the same in the face-to-face context. These differences may have affected, to a greater or lesser extent, the quality of the solutions adopted and should be considered as a moderating factor when extending the study to other HEIs.

6. Conclusions

After the emergency online teaching experience related to the COVID-19 pandemic, lecturers acquired an experience that will mark their teaching life forever. As the storm passes and face-to-face classes are resumed in a normal environment, HEIs can expect less resistance and more enthusiasm for online teaching from their lecturers [3,4]. So that this enthusiasm does not fade away, it will be necessary to support the training of lecturers by providing them with the skills and competences they require to act in the context of online education. Hybrid approaches integrating online teaching with face-to-face activities can represent a significant improvement, since many studies reveal that online education constitutes a key factor for the development of HEIs [4].
This work only reflects the perspective of the lecturers. In parallel, another study is being carried out that will reflect the students’ perspective and that will allow a comparison between the two perspectives to be established.
Further research, ideally expanding the sample size with participation of lecturers from different HEIs, is required to verify whether the proposed model continues to maintain theoretical validity. In the same way, this extension will allow confirmation of the findings.

Author Contributions

Conceptualization, D.M. and P.S.; methodology, D.M.; software, D.M. and P.S.; validation, D.M., P.S. and R.V.; formal Analysis, D.M. and P.S.; investigation, D.M. and P.S.; resources, R.V.; data curation, D.M. and P.S.; writing–original draft preparation, D.M., P.S. and R.V.; supervision, D.M.; project administration, R.V. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki and obtained prior approval from the Ethics Committee of ISLA Santarém (2020-002).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Hodges, C.; Moore, S.; Lockee, B.; Trust, T.; Bond, A. The Difference between Emergency Remote Teaching and Online Learning. EDUCAUSE Rev. 2020, 27, 1–12. [Google Scholar]
  2. Anderson, T. Teaching in an Online Learning Context. In The Theory and Practice of Online Learning; AU Press: Edmonton, AB, Canada, 2008; pp. 343–365. [Google Scholar]
  3. Kebritchi, M.; Lipschuetz, A.; Santiague, L. Issues and Challenges for Teaching Successful Online Courses in Higher Education: A Literature Review. J. Educ. Technol. Syst. 2017, 46, 4–29. [Google Scholar] [CrossRef]
  4. Allen, I.E.; Seaman, J. Online Report Card: Tracking Online Education in the United States; Babson Survey Research Group: Babson Park, MA, USA, 2016. [Google Scholar]
  5. Herman, J.H. Faculty Incentives for Online Course Design, Delivery and Professional Development. Innov. High Educ. 2013, 38, 397–410. [Google Scholar] [CrossRef]
  6. Oncu, S.; Cakir, H. Research in online learning environments: Priorities and methodologies. Comput. Educ. 2010, 57, 1098–1108. [Google Scholar] [CrossRef]
  7. Rienties, B.; Brouwer, N.; Lygo-Baker, S. The effects of online professional development on higher education teachers’ beliefs and intentions towards learning facilitation and technology. Teach. Teach. Educ. 2013, 29, 122–131. [Google Scholar] [CrossRef] [Green Version]
  8. Cubeles, A.; Riu, D. The effective integration of ICTs in universities: The role of knowledge and academic experience of professors. Technol. Pedagog. Educ. 2018, 27, 339–349. [Google Scholar] [CrossRef]
  9. Wingo, N.P.; Ivankova, N.V.; Moss, J.A. Faculty perceptions about teaching online: Exploring the literature using the technology acceptance model as an organizing framework. Online Learn. 2017, 21, 15–35. [Google Scholar] [CrossRef]
  10. Orr, R.; Williams, M.R.; Pennington, K. Institutional Efforts to Support Faculty in Online Teaching. Innov. High Educ. 2009, 34, 257–268. [Google Scholar] [CrossRef]
  11. Luongo, N. An Examination of Distance Learning Faculty Satisfaction Levels and Self-Perceived Barriers. J. Educ. Online 2018, 15, n2. [Google Scholar] [CrossRef]
  12. Cook, R.G.; Ley, K.; Crawford, C.; Warner, A. Motivators and Inhibitors for University Faculty in Distance and e-learning. Br. J. Educ. Technol. 2009, 40, 149–163. [Google Scholar] [CrossRef]
  13. Martinho, D. O Ensino Online nas Instituições de Ensino Superior Privado. As perspetivas: Docente e discente e as implicações na tomada de decisão institucional. Doctoral Thesis, Lisbon University, Lisbon, Portugal, 2014. [Google Scholar]
  14. Rapanta, C.; Botturi, L.; Goodyear, P.; Guàrdia, L.; Koole, M. Online University Teaching During and After the Covid-19 Crisis: Refocusing Teacher Presence and Learning Activity. Postdigit. Sci. Educ. 2020, 2, 923–945. [Google Scholar] [CrossRef]
  15. García-Peñalvo, F.J.; Corell, A.; Abella-García, V.; Grande-de-Prado, M. Recommendations for Mandatory Online Assessment in Higher Education during the COVID-19 Pandemic. In Radical Solutions for Education in a Crisis Context; Lecture Notes in Educational Technology; Burgos, D., Tliti, A., Tabacco, A., Eds.; Springer: Singapore, 2021; pp. 85–98. [Google Scholar] [CrossRef]
  16. Alhumaid, K.; Ali, S.; Waheed, A.; Zahid, E.; Habes, M. COVID-19 & E-Learning: Perceptions & Attitudes of Teachers Towards E-Learning Acceptance in The Developing Countries. Multicult. Educ. 2020, 6, 100–115. [Google Scholar]
  17. Dhawan, S. Online Learning: A Panacea in the Time of COVID-19 Crisis. J. Educ. Technol. Syst. 2020, 49, 5–22. [Google Scholar] [CrossRef]
  18. Davis, F. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q. 1989, 13, 319–340. [Google Scholar] [CrossRef] [Green Version]
  19. Venkatesh, V.; Morris, M.G.; Davis, G.B.; Davis, F.D. User acceptance of information technology: Toward a unified view. MIS Q. 2003, 27, 425–478. [Google Scholar] [CrossRef] [Green Version]
  20. Liu, I.; Chen, M.; Sun, Y.; Wible, D.; Kuo, C.-H. Extending the TAM model to explore the factors that affect intention to use an online learning community. Comput. Educ. 2010, 54, 600–610. [Google Scholar] [CrossRef]
  21. Cheung, R.; Vogel, D. Predicting user acceptance of collaborative technologies: An extension of the technology acceptance model for e-learning. Comput. Educ. 2013, 63, 160–175. [Google Scholar] [CrossRef]
  22. Garrison, R. E-Learning in the 21st Century: A Framework for Research and Practice; Taylor & Francis: New York, NY, USA, 2011; ISBN 978-0-203-83876-1. [Google Scholar]
  23. Beetham, H.; Sharpe, R. An introduction to rethinking pedagogy for a digital age. In Rethinking Pedagogy for a Digital Age; Routledge: Oxon, UK, 2007; pp. 1–10. ISBN 978-0-203-96168-1. [Google Scholar]
  24. Joo, Y.J.; Park, S.; Lim, E. Factors Influencing Preservice Teachers’ Intention to Use Technology: TPACK, Teacher Self-efficacy, and Technology Acceptance Model. Educ. Technol. Soc. 2018, 21, 48–59. [Google Scholar]
  25. Li, C.; Garza, V.; Keicher, A.; Popov, V. Predicting High School Teacher Use of Technology: Pedagogical Beliefs, Technological Beliefs and Attitudes, and Teacher Training. Technol. Knowl. Learn. 2019, 24, 501–518. [Google Scholar] [CrossRef]
  26. Walters, S.; Grover, K.S.; Turner, R.C.; Alexander, J.C. Faculty Perceptions Related to Teaching Online: A Starting Point for Designing Faculty Development Initiatives. Turk. Online J. Distance Educ. 2017, 18, 4–19. [Google Scholar] [CrossRef]
  27. Abdullah, F.; Ward, R. Developing a General Extended Technology Acceptance Model for E-Learning (GETAMEL) by analysing commonly used external factors. Comput. Hum. Behav. 2016, 56, 238–256. [Google Scholar] [CrossRef]
  28. Rizun, M.; Strzelecki, A. Students’ Acceptance of the COVID-19 Impact on Shifting Higher Education to Distance Learning. Int. J. Environ. Res. Public Health 2020, 17, 6468. [Google Scholar] [CrossRef] [PubMed]
  29. Liu, B. Many Facets of Sentiment Analysis. In A Practical Guide to Sentiment Analysis; Cambria, E., Das, D., Bandyopadhyay, S., Feraco, A., Eds.; Socio-Affective Computing; Springer: Cham, Switzerland, 2017; Volume 5, pp. 11–40. [Google Scholar] [CrossRef]
  30. Taboada, M.; Brooke, J.; Tofiloski, M.; Voll, K.; Stede, M. Lexicon-Based Methods for Sentiment Analysis. Comput. Linguist. 2011, 37, 267–307. [Google Scholar] [CrossRef]
  31. Qian, Q.; Huang, M.; Lei, J.; Zhu, X. Linguistically Regularized LSTM for Sentiment Classification. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics, Vancouver, BC, Canada, 30 July–4 August 2017. [Google Scholar] [CrossRef] [Green Version]
  32. Kanavos, A.; Nodarakis, N.; Sioutas, S.; Tsakalidis, A.; Tsolis, D.; Tzimas, G. Large Scale Implementations for Twitter Sentiment Classification. Algorithms 2017, 10, 33. [Google Scholar] [CrossRef]
  33. Medhat, W.; Hassan, A.; Korashy, H. Sentiment analysis algorithms and applications: A survey. Ain Shams Eng. J. 2014, 5, 1093–1113. [Google Scholar] [CrossRef] [Green Version]
  34. Cummins, R.; Gullone, E. Why we should not use 5-point Likert scales: The case for subjective quality of life measurement. In Proceedings of the Second International Conference on Quality of Life in Cities, Singapore, 8–10 March 2000; National University of Singapore: Singapore, 2000; pp. 74–93. [Google Scholar]
  35. Meyer, K.A.; Murrell, V.S. A National Study of Training Content and Activities for Faculty Development for Online Teaching. J. Asynchronous Learn. Netw. 2014, 18, n1. [Google Scholar] [CrossRef]
  36. Ko, S.; Rossen, S. Teaching Online: A Practical Guide; Routledge: New York, NY, USA, 2017; ISBN 978-0-203-42735-4. [Google Scholar]
  37. Calvin, J.; Freeburg, B.W. Exploring adult learners’ perceptions of technology competence and retention in web-based courses. Q. Rev. Distance Educ. 2010, 11, 63–72. [Google Scholar]
  38. Souza, M.; Vieira, R. Sentiment Analysis on Twitter Data for Portuguese Language. In Computational Processing of the Portuguese Language; Springer: Berlin, Germany, 2012; pp. 241–247. [Google Scholar] [CrossRef]
  39. Rani, S.; Kumar, P. A Sentiment Analysis System to Improve Teaching and Learning. Computer 2017, 50, 36–43. [Google Scholar] [CrossRef]
  40. Marôco, J.; Garcia-Marques, T. Qual a fiabilidade do alfa de Cronbach? Questões antigas e soluções modernas? Laboratório de Psicologia 2006, 4, 65–90. [Google Scholar] [CrossRef] [Green Version]
  41. Marôco, J. Análise Estatística com SPSS Statistics; ReportNumber: Lisboa, Portugal, 2018; ISBN 9789899676350. [Google Scholar]
  42. Hill, M.M.; Hill, A. Investigação por Questionário; Silabo: Lisboa, Portugal, 2008; ISBN 9789726182733. [Google Scholar]
  43. Hair, J.F.; Tatham, R.; Anderson, R.; Black, W. Análise Multivariada de Dados; Bookman: São Paulo, Brazil, 2009; ISBN 9788577804023. [Google Scholar]
  44. R Core Team. R: A Language and Environment for Statistical Computing; R Foundation for Statistical Computing: Vienna, Austria, 2020. [Google Scholar]
  45. Ringle, C.; Silva, D.D.; Bido, D. Structural Equation Modeling with the SmartPLS. Braz. J. Mark. 2015, 13. [Google Scholar] [CrossRef]
  46. Hair, J.F., Jr.; Sarstedt, M.; Hopkins, L.; Kuppelwieser, V.G. Partial least squares structural equation modeling (PLS-SEM) An emerging tool in business research. Eur. Bus. Rev. 2014, 26, 106–121. [Google Scholar] [CrossRef]
  47. Hou, H.-Y.; Lo, Y.-L.; Lee, C.-F. Predicting Network Behavior Model of E-Learning Partner Program in PLS-SEM. Appl. Sci. 2020, 10, 4656. [Google Scholar] [CrossRef]
  48. Marôco, J. Análise de Equações Estruturais; ReportNumber: Lisboa, Portugal, 2010; ISBN 9789899676336. [Google Scholar]
  49. George, D.; Mallery, P. IBM SPSS Statistics 26 Step by Step: A Simple Guide and Reference; Routledge: New York, NY, USA, 2019; ISBN 9780367174354. [Google Scholar]
  50. Chin, W.W.; Newsted, P.R. Structural Equation Modeling Analysis with Small Samples Using Partial Least Square. In Statistical Strategies for Small Sample Research; Hoyle, R., Ed.; SAGE Publications: Thousand Oaks, CA, USA, 1999; pp. 307–371. [Google Scholar]
  51. Fornell, C.; Larcker, D.F. Evaluating structural equation models with unobservable variables and measurement error. J. Mark. Res. 1981, 18, 39–50. [Google Scholar] [CrossRef]
  52. Chin, W.W. The Partial Least Squares Approach for Structural Equation Modeling. In Modern Methods for Business Research; Lawrence Erlbaum Associates: New York, NY, USA, 1998; pp. 295–336. [Google Scholar]
  53. Hebebci, M.T.; Bertiz, Y.; Alan, S. Investigation of Views of Students and Teachers on Distance Education Practices during the Coronavirus (COVID-19) Pandemic. Int. J. Technol. Educ. Sci. 2020, 4, 267–282. [Google Scholar] [CrossRef]
  54. Spoel, I.; Noroozi, O.; Schuurink, E.; Ginkel, S. Teachers’ online teaching expectations and experiences during the Covid19-pandemic in the Netherlands. Eur. J. Teach. Educ. 2020, 43, 623–638. [Google Scholar] [CrossRef]
  55. Tartavulea, C.V.; Albu, C.N.; Albu, N.; Dieaconescu, R.; Petre, S. Online Teaching Practices and the Effectiveness of the Educational Process in the Wake of the COVID-19 Pandemic. Amfiteatru Econ. 2020, 22, 920–936. [Google Scholar] [CrossRef]
  56. Henseler, J.; Hubona, G.; Ray, P.A. Using PLS path modeling in new technology research: Updated guidelines. Ind. Manag. Data Syst. 2016, 116, 2–20. [Google Scholar] [CrossRef]
Figure 1. Conceptual model.
Figure 2. Flow chart representing the determination of the polarity of the open questions.
Figure 3. Frequency of the sentiment identified.
Figure 4. Partial least squares structure model (inner path coefficients and outer weights).
Table 1. Constructs and their inspiration sources.
Section | Constructs | Number of Items | Source
2 | Online teaching attitude (OTA) | 3 | [18,19]
3 | Preferred online activities (POA) | 5 | [35,36]
4 | Technological experience (TEX) | 3 | [25,26,37]
5 | Online teaching sentiment (OTS) | 3 (*) | [38,39]
(*) Open questions.
Table 2. Cronbach’s alpha value for the pilot study.
Cronbach’s Alpha | Number of Items
0.792 | 11
Table 3. Adjustment quality for the structural equation modeling (SEM) model.
Constructs | Items | Loadings | VIF | CA | CR | R-Square | AVE
OTA | OTA1 | 0.885 | 1.950 | 0.840 | 0.902 | 0.026 | 0.755
OTA | OTA2 | 0.900 | 2.238
OTA | OTA3 | 0.819 | 1.867
POA | POA1 | 0.609 | 1.759 | 0.802 | 0.848 | 0.300 | 0.557
POA | POA2 | 0.891 | 2.663
POA | POA3 | 0.834 | 1.933
POA | POA4 | 0.718 | 1.586
POA | POA5 | 0.642 | 1.622
TEX | TEX1 | 0.768 | 1.303 | 0.686 | 0.687 | -- | 0.615
TEX | TEX2 | 0.753 | 1.317
TEX | TEX3 | 0.830 | 1.531
OTS | OTS1 | 0.772 | 1.580 | 0.789 | 0.821 | 0.155 | 0.699
OTS | OTS2 | 0.866 | 1.879
OTS | OTS3 | 0.867 | 1.626
Table 4. Discriminant validity.
  | OTA | OTS | POA | TEX
OTA | 0.869
OTS | −0.091 | 0.836
POA | 0.160 | 0.157 | 0.747
TEX | 0.351 | −0.211 | 0.472 | 0.784
Diagonal values (in bold) are the square roots of the average variance extracted (AVE).
Table 5. F-square.
  | OTA | OTS | POA | TEX
OTA | - | 0.030 | 0.111 | -
OTS | - | - | - | -
POA | - | 0.126 | - | -
TEX | 0.026 | 0.125 | 0.254 | -
Table 6. Predictive validity (Q2).
Constructs | SSO | SSE | Q2 = 1 − (SSE/SSO)
OTA | 195.000 | 195.000 | 0
OTS | 195.000 | 180.284 | 0.075
POA | 325.000 | 325.000 | 0
TEX | 195.000 | 195.000 | 0
SSO—sum of squares of observations (prediction using the mean); SSE—sum of squares of prediction errors.
Table 7. Online teaching attitude.
Item Code | Item | Mean | SD *
OTA3 | I have the same availability for online as for face-to-face teaching | 3.71 | 0.85
OTA1 | Quality of online education in relation to face-to-face education | 3.25 | 0.98
OTA2 | I like online education in the same way as face-to-face education | 3.14 | 1.12
(*) Standard deviation.
Table 8. Preferred online activities.
Item Code | Item | Mean | SD *
POA4 | Online sessions (Zoom, Teams, etc.) | 8.48 | 1.44
POA3 | Oral presentations | 7.66 | 1.85
POA2 | Written assignments (in group) | 7.34 | 2.26
POA1 | Discussion forums | 6.88 | 2.09
POA5 | Chat activities | 6.48 | 2.20
(*) Standard deviation.
Table 9. Technological experience.
Item Code | Item | Mean | SD *
TEX1 | Online meeting systems (Zoom, Teams, etc.) | 3.94 | 0.24
TEX3 | Online learning environments (Moodle, etc.) | 3.29 | 0.84
TEX2 | Collaborative work tools (Google Drive, etc.) | 3.18 | 0.91
(*) Standard deviation.
Table 10. Example of qualitative sentiment for the impact on lecturers’ careers.
Portuguese (English *) | Sentiment | Likert Value
Enquanto docente, esta foi a minha primeira experiência no ensino à distância. Considero que a minha adaptação se efetuou de uma forma tranquila. De relevar que é necessário adotar abordagens mais exigentes na preparação das aulas. Requer a utilização de formas adicionais para captar a atenção do estudante e de os motivar. Aula após aula a assiduidade melhorou significativamente. (As a lecturer, this was my first experience in distance learning. I believe that my adaptation took place in a calm way. It is important to note that it is necessary to adopt more demanding approaches in class preparation. It requires the use of additional ways to capture the student’s attention and motivate him/her. Lecture after lecture, attendance improved significantly. *) | 5 | 5
Maior flexibilidade/disponibilidade e novas aprendizagens. Maior preparação para futuras situações ou oportunidades. (Greater flexibility/availability and new learning. Greater preparation for future situations or opportunities. *) | 1 | 4
É o mesmo. (It is the same. *) | 0 | 3
O formato de ensino online é mais difícil para o professor do que o formato presencial. A preparação e logística das aulas online é maior do que para presenciais, bem como o tratamento que é necessário fazer. Provavelmente menos horas de docência considerando o esforço e turmas com maior dimensão. (The online teaching format is more difficult for the lecturer than the face-to-face format. The preparation and logistics of online lectures are greater than for in-person lectures, as well as the treatment that is necessary. Probably fewer teaching hours considering the effort and larger class sizes. *) | −1 | 2
Impacto negativo. Mais exigente para o docente na preparação das matérias. (Negative impact. More demanding for the lecturer in the preparation of the subjects. *) | −2 | 1
* The answers related to the impact on lecturers’ careers were translated to English to allow better comprehension.
Table 11. Specific indirect effects.
Causal Relations | Coefficient
TEX -> OTA -> OTS | −0.027
OTA -> POA -> OTS | 0.110
TEX -> OTA -> POA -> OTS | 0.018
TEX -> POA -> OTS | 0.167
TEX -> OTA -> POA | 0.045
Table 12. Hypothesis results.
Hypothesis | Path Coefficients | Results | Effect
H1: OTA positively affects OTS | −0.169 | Not confirmed | -
H2: OTA positively affects POA | 0.282 | Confirmed | Moderate
H3: POA positively affects OTS | 0.380 | Confirmed | Strong
H4: TEX positively affects OTS | 0.368 | Confirmed | Strong
H5: TEX positively affects OTA | 0.160 | Confirmed | Moderate
H6: TEX positively affects POA | 0.427 | Confirmed | Strong
