Article

Public Opinions about Online Learning during COVID-19: A Sentiment Analysis Approach

1 Centre for Educational Technology, Indian Institute of Technology Kharagpur, Kharagpur 721302, India
2 Commonwealth of Learning, Burnaby, BC V5H 4M2, Canada
3 Department of Mining Engineering, Indian Institute of Technology Kharagpur, Kharagpur 721302, India
4 Graduate Institute of Science Education, National Taiwan Normal University, Taipei City 116, Taiwan
* Authors to whom correspondence should be addressed.
Sustainability 2021, 13(6), 3346; https://doi.org/10.3390/su13063346
Submission received: 16 February 2021 / Revised: 10 March 2021 / Accepted: 15 March 2021 / Published: 18 March 2021

Abstract

The aim of this study was to analyze public opinion about online learning during the COVID-19 (Coronavirus Disease 2019) pandemic. A total of 154 articles related to online learning were extracted from online news and blogging websites via Google and DuckDuckGo. The articles were collected over 45 days, starting from 11 March 2020, the day the World Health Organization (WHO) declared COVID-19 a worldwide pandemic. We applied the dictionary-based approach of the lexicon-based method to perform sentiment analysis on the articles extracted through web scraping, calculating the polarity and subjectivity scores of each extracted article using the TextBlob library. The results showed that over 90% of the articles were positive, and the remaining were mildly negative. In general, the blogs were more positive, but also more opinionated, than the news articles.

1. Introduction

The unprecedented closure of educational institutions due to COVID-19 resulted in over 90% of students being out of school by 30 March 2020 [1]. In the 54 countries of the Commonwealth, an estimated 574 million students were out of school by 15 May 2020 [2]. During this crisis, while several governments responded by using a range of tools (radio, television, printed text, Internet) to continue supporting teaching, online learning emerged as the predominant method of teaching and learning. A global overview of the interruption of education due to COVID-19, covering 31 countries, showed that the measures taken exacerbated social injustice, inequality, and the digital divide [3] and highlighted the need for “a pedagogy of care, affection, and empathy” [4].
Developing high-quality online courses requires skills and resources, in addition to an understanding of pedagogical approaches suitable for the online environment; these are in short supply during the pandemic [5]. Daniel [6] argued that during the crisis it is more important to focus on doing than to fight for perfection or try to learn pedagogy or technology overnight. Teachers used many innovative approaches to support students during this period. In the Philippines, they used Facebook groups to build solidarity, formulate strategies, and raise donations to support students [7]. In practice, however, there are concerns about fraud and privacy in conducting online examinations, and teaching practical courses online has become a challenge [5].
The growing focus on online learning has resulted in exponential growth in online courses, mostly offered through video conferencing solutions, and this growth brings challenges. There are concerns about the digital divide in many countries [8], and many are not enthusiastic about online courses during the pandemic [5]. However, the EdTech industry has treated the crisis as a business opportunity [9]. Many EdTech companies, educational businesses, and technology philanthropists have invested in taking new or existing products to scale [10]. This has accelerated the commercialization of education, creating a “seller’s market” [11]. Some researchers also see the use of online learning during COVID-19 as a global EdTech experiment [12,13] that will lead to a better understanding of the efficacy of online learning and the transformative power of digital learning. On the other hand, Zhao [14] recommends reimagining the what, where, and how of learning during COVID-19 not just as a reaction, but as proactive thinking to meet the demands of the 4th Industrial Revolution.
Popular media matter here because “mainstream media coverage is an important influence on the governmental agenda” [15] (p. 581). Of late, new media (blogs) have emerged as platforms for the expression of public opinion that set the popular agenda [16], and social media (micro-blogging) play an important role in setting the policy agenda after a catastrophic event [17]. Medhat et al. [18] noted that people express their feelings and opinions about specific topics or products in online news articles, micro-blogs, and blogs, and that these sources of data still need more in-depth analysis. News articles and blogs are abundant sources of information [19] that can be analyzed to evaluate and strengthen online learning. Considering the rapid proliferation of online learning and the large number of items that regularly appear in digital newspapers and blogs, we used sentiment analysis (SA), a widely used technique for studying people’s opinions [20], to understand the discourse around online learning. Recognizing the practical urgency and the need to be critical practitioners at the time of the pandemic [21], this study attempted to analyze and understand people’s opinions about the use and effectiveness of online learning, as reported in digital media from 11 March 2020 to 26 April 2020. The dataset used in this study is therefore limited, but it presents a unique approach to studying people’s opinions about online learning during the COVID-19 pandemic. The following research questions guided this study:
  • RQ1: Among the extracted articles, were the articles positive, negative, or neutral?
  • RQ2: Were the articles written fact-based or opinion-based?
  • RQ3: Is there a significant difference in people’s sentiments between the news articles and blogs?

2. Related Works

2.1. Web Scraping

Web scraping is a technique to extract large amounts of data from websites and adapt it to various scenarios [22]. It is mainly used to extract unstructured data from websites and transform it into structured data. The extracted data are saved to a local file or a database in a spreadsheet format for pre-processing and analytics. Commonly used web scraping techniques are web-scraping software, DOM parsing, computer-vision webpage analyzers, HTTP programming, and HTML parsing [23]. Web scraping can easily be performed using Python libraries such as Beautifulsoup (https://pypi.org/project/beautifulsoup4/) (accessed on 12 May 2020) and Requests (https://requests.readthedocs.io/en/master/) (accessed on 12 May 2020). This method is cost-effective and efficient: it saves several thousands of hours of traditional copy and paste, gives access to clean and well-processed data without much effort, and provides real-time data with just a handful of lines of code [24].
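As a rough illustration of the technique, a Requests + Beautifulsoup scrape of article paragraphs might look like the sketch below. The function names and the choice of collecting `<p>` tags are illustrative assumptions, not the study's actual implementation.

```python
# Minimal web-scraping sketch: fetch a page with Requests, parse it with
# Beautifulsoup, and keep only the paragraph text.
import requests
from bs4 import BeautifulSoup

def extract_paragraph_text(html: str) -> str:
    """Parse HTML and return the concatenated text of all <p> tags."""
    soup = BeautifulSoup(html, "html.parser")
    return "\n".join(p.get_text(strip=True) for p in soup.find_all("p"))

def scrape_article(url: str) -> str:
    """Fetch a page over HTTP and extract its paragraph text."""
    response = requests.get(url, timeout=10)
    response.raise_for_status()  # fail loudly on HTTP errors
    return extract_paragraph_text(response.text)
```

Separating the HTTP fetch from the HTML parsing keeps the parsing step testable offline against saved pages.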
Many researchers have employed web scraping methods in different domains. For example, Haddaway [25] used web-scraping software to search and create a database of grey literature such as journals, reports, working papers, government documents, white papers, and evaluations, which are otherwise difficult to extract. Herrmann and Hoyden [26] applied web scraping to extract research papers and articles related to marketing research and marketing science. In another study, Chen et al. [27] employed web scraping to extract users’ posts from social media platforms such as Facebook, Weibo, Twitter, and Baidu. They scraped 18,809,276 original posts from 5,613,807 users, which would have been difficult using traditional methods.

2.2. Sentiment Analysis

“Sentiment Analysis (SA) or Opinion Mining (OM) is the computational study of people’s opinions, attitudes, and emotions toward an entity” [18] (p. 1093). SA can be performed at three levels: (1) document level, classifying an opinion document as expressing a positive or negative opinion or sentiment; (2) sentence level, classifying the sentiment expressed in each sentence; and (3) aspect level, classifying the sentiment with respect to specific aspects of entities [28]. SA can be further categorized by the technique used: the lexicon-based approach uses a sentiment lexicon, a collection of known and precompiled sentiment terms, while the machine-learning approach uses linguistic features and applies machine learning algorithms [18].
SA has been employed in different domains such as product reviews [20], movie reviews [29], hotel reviews [30], news [31,32], consumer reviews [28], political debates [33], and social media posts (e.g., Facebook, Twitter, Weibo) [34,35]. One of the earliest works on sentiment analysis was conducted by [36], where the researchers performed sentiment analysis of movie reviews; the results showed that machine learning techniques performed better than manual methods. Dave et al. [37] presented an approach to mine the opinions hidden in a given piece of text: opinions about products were mined from the Web and analyzed using NLP techniques, then divided into positive and negative sentiments by the algorithm, with feature opinions and context taken into consideration. Jiang et al. [38] used SA to assess public opinion about a large infrastructure project, the Three Gorges Dam, transforming textual data collected from people’s posts on social media (Weibo) into emotional dimensions. In a study by Duan et al. [30], SA was employed to analyze user reviews to measure hotel service quality. Chang et al. [39] deployed SA to examine the influence of user comments, the number of views, and the number of likes on online video popularity. In another study, Lee et al. [28] applied a lexicon-based approach to explore customers’ experience and satisfaction.
Recently, SA has received much attention from education researchers. For example, Tseng et al. [40] used the text sentiment analysis kit SnowNLP to evaluate 20,000 textual opinions obtained from teaching evaluation questionnaires for selecting outstanding teaching faculty. The results revealed that the text sentiment analysis system designed by the researchers could predict 97% of positive sentiment and 87% of negative sentiment. Hew et al. [41] adopted the SA approach to examine course features (e.g., course structure, course content, course instructor) of 249 MOOCs. They found that course instructor, content, assessment, and schedule significantly predicted student satisfaction, whereas course major, duration, perceived workload, and perceived difficulty played no significant roles. In a recent study by [42], aspect-based analysis was employed to analyze 105K student reviews extracted from Coursera. The results revealed that the SA approach provided more accurate results than the expensive manual approach. It is clear that SA is one of the most efficient techniques for assessing people’s views and opinions about a topic.

3. Methodology

We conducted SA in a series of steps. The first step involved the collection of data from different search engines using web scraping. Next, the collected data were refined for applying the models. Finally, the refined data were analyzed through the SA model (see Figure 1).

3.1. Dataset and Pre-Processing

In this study, we selected two widely used search engines: Google (https://www.google.com/) (accessed on 11 March 2020) and DuckDuckGo (https://duckduckgo.com/) (accessed on 11 March 2020). For Google, we extracted the required online news articles and blogs using the Advanced Search option, after clearing the cache and removing location settings to avoid the algorithm’s impact on the results. We extracted the required dataset for 45 days starting from 11 March 2020, when the World Health Organization (WHO) declared Coronavirus disease a pandemic. We searched Google using the syntax combination: “COVID-19” OR “Coronavirus” OR “pandemic” “(“online learning”|“remote learning”|“distance learning”|“blended learning”|“e-learning”|“Technology-enabled learning”)” when: 45 days (see Figure 2).
After the search results appeared, the HTML attributes were studied and, accordingly, the web scraping library Beautifulsoup was applied. We followed a similar approach to search the dataset using DuckDuckGo. We selected only the 154 online news articles or blogs that were present in both search engines. The articles were compiled into a dataset in the form of an Excel sheet with the following columns: title of the article, content of the article, URL of the article, and whether the article was online news or a blog. The dataset was then imported as a Python data frame.
The next step involved pre-processing to remove unwanted noise from the textual data using the following steps:
(a) convert all the text to lowercase;
(b) eliminate all URLs;
(c) remove all hashtags, user mentions, and emoji;
(d) remove all punctuation;
(e) remove all numbers;
(f) remove parentheses.
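A minimal sketch of steps (a)–(f) using Python's re module follows; the exact regular expressions are our assumptions, since the study does not specify its implementation.

```python
# Pre-processing sketch for cleaning scraped article text.
import re
import string

def preprocess(text: str) -> str:
    """Apply cleaning steps (a)-(f): lowercase, strip URLs, hashtags,
    user mentions, emoji, punctuation, numbers, and parentheses."""
    text = text.lower()                                # (a) lowercase
    text = re.sub(r"https?://\S+|www\.\S+", "", text)  # (b) URLs
    text = re.sub(r"[#@]\w+", "", text)                # (c) hashtags, mentions
    text = text.encode("ascii", "ignore").decode()     # (c) emoji (drop non-ASCII)
    text = re.sub(r"[()]", "", text)                   # (f) parentheses
    text = text.translate(str.maketrans("", "", string.punctuation))  # (d) punctuation
    text = re.sub(r"\d+", "", text)                    # (e) numbers
    return re.sub(r"\s+", " ", text).strip()           # collapse whitespace
```

The order matters: URLs and hashtags must be removed before punctuation stripping, or the `://` and `#` markers the patterns rely on would already be gone.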

3.2. Sentiment Analysis

We conducted SA using the dictionary-based approach of the lexicon-based method. To analyze people’s opinions, we studied the polarity and subjectivity of the articles. Polarity determines whether a text expresses a positive, negative, or neutral opinion; it ranges from −1 to +1, where −1 is extremely negative sentiment and +1 is extremely positive sentiment. Subjectivity, on the other hand, classifies a text as fact or opinion; it ranges from 0 to +1, where 0 indicates very objective and +1 indicates very subjective (Yaqub et al., 2018). We employed TextBlob (http://textblob.readthedocs.org/en/dev/) (accessed on 12 May 2020), a framework developed by Loria [43] that has been widely used by researchers [44,45] to conduct SA. TextBlob is a Python (2 and 3) library for processing textual data. Micu et al. [46] employed TextBlob to analyze customers’ liking, rating, and reviewing of restaurants and found it to be an effective sentiment analysis tool. Similarly, Hasan et al. [47] advocated that TextBlob is one of the best tools for calculating polarity and subjectivity scores for textual data. In a recent study by [48], TextBlob was used to conduct sentiment analysis of the feelings of patients suffering from a rare disease.
In the lexicon-based approach, a sentiment lexicon is constructed from appropriate sentiment words, degree adverbs, and negation words, each annotated with sentiment intensity and polarity [49]. A sentiment lexicon can be used to differentiate between objective facts and subjective opinions in a text. Each lexicon entry is given a polarity and subjectivity score based on the context, the form of the word, and its position in the sentence (e.g., adverb, adjective). For example, for the word “great” in a text, the following encoding is done (see Table 1).
Therefore, the output for “great” is given according to the context in which the word is used. However, if the single word “great” is called with no context, the model outputs the average of all “great” lexicon entries. For the negation of a lexicon entry, the model multiplies the polarity by (−0.5) and does not change the subjectivity or the intensity. If a modifier word such as “very” is added to the word “great”, the model adjusts the scores according to the following rules:
  • New Polarity = (Initial Polarity of the word ‘great’) * (intensity of the word ‘very’)
  • New Subjectivity = (Initial Subjectivity of the word ‘great’) * (intensity of the word ‘very’)
Therefore, according to the above rules, TextBlob assigns polarity and subjectivity scores to every identifiable lexicon entry in the provided phrase and ignores words that are not in the lexicon (e.g., proper nouns). It then averages the scores over all lexicon entries in the provided article. We applied the TextBlob library to our dataset to calculate the polarity and subjectivity of each article.

4. Results and Discussion

4.1. Research Question 1

Figure 3 shows that the polarity of the articles ranges from around −0.05 to 0.3 (see bar graph above the polarity axis). This implies that the articles written during this time have a polarity around neutral, leaning towards the somewhat positive side. The results showed that around 92.2% of the articles (142 out of 154) were positive in nature, i.e., their polarity scores were greater than 0. A possible reason for this result is that educators and policymakers worldwide were advocating the potential advantages of online learning in this pandemic situation [50,51], motivating instructors and students to adopt online learning in their formal education. A few articles have polarity scores below 0, but even these are close to 0. One possible reason is the digital divide: people in rural places have less access to the Internet and other digital resources required for online learning [52], which might have created anxiety among them. Another is that online learning was new for a large section of the population; people accustomed to the traditional teaching and learning process were still adjusting to the online learning environment. People’s opinion about online learning is positive, but the positivity depicted is not very high, indicating doubts regarding the usefulness and effectiveness of online learning. Therefore, notwithstanding the digital divide, there is much scope for highlighting the benefits of online learning in the media to mainstream online learning.
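The RQ1 classification amounts to bucketing articles by the sign of their polarity score; a sketch with illustrative scores (not the study's data):

```python
# Bucket articles as positive / negative / neutral by polarity sign.
polarities = [0.21, 0.05, -0.03, 0.12, 0.00, 0.30]  # illustrative scores

positive = sum(p > 0 for p in polarities)
negative = sum(p < 0 for p in polarities)
neutral = sum(p == 0 for p in polarities)
pct_positive = 100 * positive / len(polarities)
```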

4.2. Research Question 2

The subjectivity of the articles ranges from 0.3 to 0.6. This implies that most of the articles were written more on the factual side, as TextBlob assigns a score of 0 to fully factual articles and +1 to fully opinionated articles. We found that around 85.71% of the articles (132 out of 154) were more factual in nature.
Our dataset contains 82 blogs; the remaining 72 articles are from newspapers. Blogs are generally more opinionated than news articles, and the larger proportion of blogs in the dataset could have shifted its subjectivity away from the strictly factual. Therefore, we calculated the mean of the subjectivity scores (see Table 2). The results showed that the mean subjectivity score of the articles is around 0.43, i.e., towards the factual side.

4.3. Research Question 3

The sentiment analysis was conducted separately for the news articles and the blogs. An independent-samples t-test was employed to compare the subjectivity and polarity scores between news articles and blogs. There was a significant difference between the mean subjectivity scores of the news articles (M = 0.42, SD = 0.06) and the blogs (M = 0.44, SD = 0.05); [t (152) = 2.39, p < 0.05] (see Table 3). The calculated effect size (Cohen’s d) is 0.42, a small-to-medium effect by Cohen’s conventions [53]. This result indicates that blogs are more opinionated than news articles. These results are consistent with the finding of Ku et al. [19] that the nature of news articles differs from that of blogs. A possible reason is that the writing style differs between news articles and blogs: news articles are generally more fact-based, formal, and lengthier, whereas in blogs people express their personal opinions in an informal and shorter way. We also observed that most news articles provided statistics to support their arguments. For example:
The coronavirus pandemic continues to cast a pall over the global stock market. Per a report by Fitch, the world economy is expected to decline 1.9% in 2020, with the U.S., eurozone and U.K. GDP declining 3.3%, 4.2% and 3.9%, respectively. […] In this regard, per ResearchandMarkets.com, the global e-learning market is projected to reach a worth of $238 billion by 2024, at a CAR of 8.5% during 2019–2024. […] The stock has lost 17.4% compared with the index’s 18.3% decline. […] The stock has gained 2.3% against the industry’s 5.6% decline. […] In the process, it’s expected to create 22 million jobs and generate $12.3 trillion in activity. […] special report reveals 8 stocks to watch.
While we had talked about embracing technology-enhanced learning at the Singapore Institute of Technology, […] At the back of our minds, we were also ready for full e-learning to take place if the situation worsened, but the reality was different. The panic occurred five weeks ago and thankfully the online learning experience has generally been positive. […] Licences or access to technology such as Zoom, Respondus and Microsoft Teams was made available to all teaching staff and students. […] It also allowed training and guides to be provided based on the key tools that were available. […] A policy was written up to introduce flexibility with faster approval routes for changes that were needed. […] The feeling that “we are in this together”, or “if I make a mistake in the online learning platform, I am not the only one”, helped to support everyone in the journey. […] we are also discovering the joys of online teaching and learning, and improving pain points along the way.
There was also a significant difference between the mean polarity scores of the news articles (M = 0.11, SD = 0.06) and the blogs (M = 0.13, SD = 0.05); [t (152) = 2.61, p < 0.05] (see Table 4). The calculated effect size (Cohen’s d) is 0.42, a small-to-medium effect by Cohen’s conventions [53]. Figure 4 shows the polarity pattern for the news articles and blogs: the histogram for blogs is shifted towards the right relative to that for news articles. This result indicates that opinions in blogs are more positive than in news articles. In general, we can conclude that people have positive sentiments towards online learning.
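The group comparison can be reproduced with SciPy's independent-samples t-test plus a pooled-standard-deviation Cohen's d; the data below are simulated under the reported group sizes, means, and SDs, not the study's actual dataset.

```python
# Independent-samples t-test and Cohen's d on simulated polarity scores.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
news = rng.normal(0.11, 0.06, 72)   # simulated news polarity scores
blogs = rng.normal(0.13, 0.05, 82)  # simulated blog polarity scores

t_stat, p_value = stats.ttest_ind(news, blogs)

def cohens_d(a, b):
    """Cohen's d using the pooled standard deviation."""
    n1, n2 = len(a), len(b)
    pooled = np.sqrt(((n1 - 1) * a.std(ddof=1) ** 2 +
                      (n2 - 1) * b.std(ddof=1) ** 2) / (n1 + n2 - 2))
    return (b.mean() - a.mean()) / pooled
```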
Most blogs have negative opinions about the commercialization of online education by the EdTech industry and about telecommunication companies overcharging for data. This is consistent with Williamson and Hogan [10], who highlighted concerns that the commercialization of education will further increase the digital divide. In addition, people expressed concerns about access to digital devices, Internet connectivity, access to learning resources across different types of digital devices, and poor user experience with technology. Some blogs also expressed concern about the adverse effects of online learning on kindergarten students. However, most blogs appreciate that online learning ensures the continuity of education in this pandemic situation by providing flexibility in accessing content. Some blogs mentioned that collaboration among teachers would increase as learning materials are curated and widely used. The pandemic has provided an opportunity for educators and policymakers across the world to work together to support teachers by developing innovative teaching and learning methodologies that make learning more interactive and effective.
On the other hand, the news articles highlighted critical issues and challenges such as students’ privacy, low and unstable Wi-Fi connections at home, cybersecurity, and time zone differences. Some news articles mentioned that teachers are anxious and stressed about online learning as they need to come out of their comfort zone of the traditional teaching and learning process, and that instructors lack proper training in using technologies for online learning. News articles also expressed concern over the lack of digital resources and Internet access; for example, one news article mentioned that around 35% of students in South Africa were not able to participate in digital learning because of a lack of resources, widening the equity gap. News articles also mentioned that companies such as Microsoft and Google have come forward to help instructors and students by providing free resources, courses, and training to facilitate remote learning. Some universities have adopted online learning not only to continue formal teaching but also for other academic activities such as virtual international conferences, internships, teacher training, job placement interviews, and hackathons.
While the COVID-19 pandemic has created a new opportunity for the mainstreaming of online learning, public opinion will shape its actual adoption in practice by governments and educational institutions. Previous experience with climate change has shown the influence of media on practices, politics, and public opinion and understanding [54]. Another study indicated that the more negative and the more local the media coverage, the greater its impact in disciplining corporate polluters [55]. Recently, a group of scholars from South Africa highlighted the issues of equity and inequality in the “pivot” to remote teaching and learning [56]. Online media (both newspapers and blogs) will play a critical role not only in identifying the challenges but also in influencing positive changes to assist the adoption of online learning worldwide, particularly in low- and middle-income countries.

5. Conclusions

In this study, we performed sentiment analysis to investigate public opinion about online learning, analyzing online news and blogs from the early days of the pandemic. Applying SA along with web scraping is a new attempt to obtain insights into public opinion towards online learning. This study revealed positive but cautious perceptions about online learning in public digital media, with low polarity values. While this is a good starting point for an area that is changing fast, it also calls for more widespread sharing of research about online learning in public media, to create strong public opinion and drive public policy in many countries to adopt online and blended learning as a means of building a resilient system. The overall low subjectivity scores suggest that an evidence-based approach to public policy discourse is possible through digital media, even though the blogs in this study had higher subjectivity than the news items.

6. Limitations and Recommendations

This study is limited by its small dataset and its use of early reports. As the pandemic continues and people become more experienced in using online learning in various innovative and unimagined ways, more critical and negative news may appear, as already highlighted by Czerniewicz et al. [56]. Such deliberations would help improve the systems deployed and help mainstream online learning. In future studies, we plan to validate our approach on larger datasets. In addition, we will apply content analysis to gain further insights into the contents of news and blogs. This study can be replicated regularly to understand the overall sentiment about any topic, especially for public policy related to education.

Author Contributions

Conceptualization, K.K.B. and S.M.; methodology, K.K.B., S.M., A.D. and C.-Y.C.; validation, K.K.B., S.M., A.D. and C.-Y.C.; data curation, K.K.B., S.M. and A.D.; writing—original draft preparation, K.K.B., S.M. and A.D.; writing—review and editing, K.K.B., S.M. and C.-Y.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. UNESCO. COVID-19 Impact on Education; UNESCO: Paris, France, 2020; Available online: https://en.unesco.org/covid19/educationresponse (accessed on 11 September 2020).
  2. Kanwar, A.; Daniel, J. Report to Commonwealth Education Ministers: From Response to Resilience; Commonwelath of Learning: Burnaby, BC, Canada, 2020. [Google Scholar]
  3. World Bank. World Development Report 2016: Digital Dividends; World Bank Publications: Washington, DC, USA, 2016; Available online: https://www.worldbank.org/en/publication/wdr2016 (accessed on 12 March 2020).
  4. Bozkurt, A.; Jung, I.; Xiao, J.; Vladimirschi, V.; Schuwer, R.; Egorov, G.; Lambert, S.; Al-Freih, M.; Pete, J.; Olcott, D., Jr.; et al. A global outlook to the interruption of education due to COVID-19 pandemic: Navigating in a time of uncertainty and crisis. Asian J. Distance Educ. 2020, 15, 1–126. [Google Scholar]
  5. Altbach, P.G.; de Wit, H. Responding to COVID-19 with IT: A Transformative Moment? Int. High. Educ. 2020, 103, 3–5. [Google Scholar]
  6. Daniel, S.J. Education and the COVID-19 pandemic. Prospects 2020, 49, 91–96. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  7. Toquero, C.M. Emergency remote teaching amid COVID-19: The turning point. Asian J. Distance Educ. 2020, 15, 185–188. [Google Scholar]
  8. Lynch, M. E-Learning during a global pandemic. Asian J. Distance Educ. 2020, 15, 189–195. [Google Scholar] [CrossRef]
  9. Williamson, B.; Eynon, R.; Potter, J. Pandemic politics, pedagogies and practices: Digital technologies and distance education during the coronavirus emergency. Learn. Media Technol. 2020, 45, 107–114. [Google Scholar] [CrossRef]
  10. Williamson, B.; Hogan, A. Commercialisation and Privatisation in/of Education in the Context of COVID-19; Education International Research: Birmingham, UK, 2020. [Google Scholar]
  11. Teräs, M.; Suoranta, J.; Teräs, H.; Curcher, M. Post-Covid-19 education and education technology ‘solutionism’: A Seller’s Market. Postdigital Sci. Educ. 2020, 2, 863–878. [Google Scholar] [CrossRef]
  12. Anderson, J. The Coronavirus Pandemic is Reshaping Education. 2020. Available online: https://qz.com/1826369/how-coronavirus-is-changing-education/ (accessed on 14 April 2020).
  13. Zimmerman, J. Coronavirus and the great online-learning experiment: Let’s determine what our students actually learn online. Chronicle, 10 March 2020. Available online: https://www.chronicle.com/article/Coronavirusthe-Great/248216 (accessed on 11 March 2020).
  14. Zhao, Y. COVID-19 as a catalyst for educational change. Prospects 2020, 49, 29–33. [Google Scholar] [CrossRef]
  15. Wallsten, K. Agenda setting and the blogosphere: An analysis of the relationship between mainstream media and political blogs. Rev. Policy Res. 2007, 24, 567–587. [Google Scholar] [CrossRef]
  16. Aruguete, N. The agenda setting hypothesis in the new media environment. Comun. Soc. 2017, 28, 35–58. [Google Scholar] [CrossRef]
  17. Wu, Y.; Atkin, D.; Lau, T.Y.; Lin, C.; Mou, Y. Agenda setting and micro-blog use: An analysis of the relationship between Sina Weibo and newspaper agendas in China. J. Soc. Media Soc. 2013, 8, 53–62. [Google Scholar]
  18. Medhat, W.; Hassan, A.; Korashy, H. Sentiment analysis algorithms and applications: A survey. Ain Shams Eng. J. 2014, 5, 1093–1113. [Google Scholar] [CrossRef] [Green Version]
  19. Ku, L.-W.; Liang, Y.-T.; Chen, H.-H. Opinion Extraction, Summarization and Tracking in News and Blog Corpora. 2006. Available online: http://www.aaai.org/Library/Symposia/Spring/2006/ss06-03-020.php (accessed on 15 March 2020).
  20. Ng, C.; Law, K.M. Investigating consumer preferences on product designs by analyzing opinions from social networks using evidential reasoning. Comput. Ind. Eng. 2020, 139, 106180. [Google Scholar] [CrossRef]
  21. Hage, G. The haunting figure of the useless academic: Critical thinking in coronavirus time. Eur. J. Cult. Stud. 2020, 23, 662–666. [Google Scholar] [CrossRef]
  22. Mooney, S.J.; Westreich, D.J.; El-Sayed, A.M. Commentary. Epidemiology 2015, 26, 390–394. [Google Scholar] [CrossRef] [PubMed]
  23. Saurkar, A.V.; Pathare, K.G.; Gode, S.A. An overview on web scraping techniques and tools. Int. J. Future Revolut. Comput. Sci. Commun. Eng. 2018, 4, 363–367. [Google Scholar]
  24. Hillen, J. Web scraping for food price research. Br. Food J. 2019, 121, 3350–3361. [Google Scholar] [CrossRef]
  25. Haddaway, N.R. The use of web-scraping software in searching for grey literature. Grey J. 2015, 11, 186–190. [Google Scholar]
26. Herrmann, M.; Hoyden, L. Applied web scraping in market research. In Proceedings of the 1st International Conference on Advanced Research Methods and Analytics, Valencia, Spain, 6–7 July 2016; Universitat Politecnica de Valencia: Valencia, Spain, 2016. [Google Scholar]
  27. Chen, Z.; Zhang, R.; Xu, T.; Yang, Y.; Wang, J.; Feng, T. Emotional attitudes towards procrastination in people: A large-scale sentiment-focused crawling analysis. Comput. Hum. Behav. 2020, 110, 106391. [Google Scholar] [CrossRef]
  28. Lee, S.-W.; Jiang, G.; Kong, H.-Y.; Liu, C. A difference of multimedia consumer’s rating and review through sentiment analysis. Multimed. Tools Appl. 2020. [Google Scholar] [CrossRef]
  29. Bai, X. Predicting consumer sentiments from online text. Decis. Support Syst. 2011, 50, 732–742. [Google Scholar] [CrossRef]
  30. Duan, W.; Yu, Y.; Cao, Q.; Levy, S. Exploring the impact of social media on hotel service performance. Cornell Hosp. Q. 2015, 57, 282–296. [Google Scholar] [CrossRef] [Green Version]
  31. Godbole, N.; Srinivasaiah, M.; Skiena, S. Large-scale sentiment analysis for news and blogs. In Proceedings of the International Conference on Weblogs and Social Media, ICWSM’07, Boulder, CO, USA, 26–28 March 2007. [Google Scholar]
  32. Moreo, A.; Romero, M.; Castro, J.; Zurita, J. Lexicon-based Comments-oriented News Sentiment Analyzer system. Expert Syst. Appl. 2012, 39, 9166–9180. [Google Scholar] [CrossRef]
  33. Balaguer, P.; Teixidó, I.; Vilaplana, J.; Mateo, J.; Rius, J.; Solsona, F. CatSent: A Catalan sentiment analysis website. Multimed. Tools Appl. 2019, 78, 28137–28155. [Google Scholar] [CrossRef]
  34. Li, S.; Wang, Y.; Xue, J.; Zhao, N.; Zhu, T. The impact of COVID-19 epidemic declaration on psychological consequences: A study on active weibo users. Int. J. Environ. Res. Public Health 2020, 17, 2032. [Google Scholar] [CrossRef] [Green Version]
  35. Pandey, A.C.; Rajpoot, D.S.; Saraswat, M. Twitter sentiment analysis using hybrid cuckoo search method. Inf. Process. Manag. 2017, 53, 764–779. [Google Scholar] [CrossRef]
  36. Pang, B.; Lee, L.; Vaithyanathan, S. Thumbs up? Sentiment classification using machine learning techniques. In Proceedings of the 2002 Conference on Empirical Methods of Natural Language Processing (EMNLP’02), Philadelphia, PA, USA, 6–7 July 2002. [Google Scholar]
  37. Dave, K.; Lawrence, S.; Pennock, D.M. Mining the peanut gallery: Opinion extraction and semantic classification of product reviews. In Proceedings of the 12th International Conference on World Wide Web, Budapest, Hungary, 20–24 May 2003. [Google Scholar]
  38. Jiang, H.; Qiang, M.; Lin, P. Assessment of online public opinions on large infrastructure projects: A case study of the Three Gorges Project in China. Environ. Impact Assess. Rev. 2016, 61, 38–51. [Google Scholar] [CrossRef]
  39. Chang, W.-L.; Chen, L.-M.; Verkholantsev, A. Revisiting online video popularity: A sentimental analysis. Cybern. Syst. 2019, 50, 563–577. [Google Scholar] [CrossRef]
  40. Tseng, C.-W.; Chou, J.-J.; Tsai, Y.-C. Text mining analysis of teaching evaluation questionnaires for the selection of outstanding teaching faculty members. IEEE Access 2018, 6, 72870–72879. [Google Scholar] [CrossRef]
  41. Hew, K.F.; Hu, X.; Qiao, C.; Tang, Y. What predicts student satisfaction with MOOCs: A gradient boosting trees supervised machine learning and sentiment analysis approach. Comput. Educ. 2020, 145, 103724. [Google Scholar] [CrossRef]
  42. Kastrati, Z.; Imran, A.S.; Kurti, A. Weakly supervised framework for aspect-based sentiment analysis on students’ reviews of MOOCs. IEEE Access 2020, 8, 106799–106810. [Google Scholar] [CrossRef]
  43. Loria, S. Textblob Documentation; TextBlob: Brooklyn, NY, USA, 2018. [Google Scholar]
  44. Onyenwe, I.; Nwagbo, S.; Mbeledogu, N.; Onyedinma, E. The impact of political party/candidate on the election results from a sentiment analysis perspective using #AnambraDecides2017 tweets. Soc. Netw. Anal. Min. 2020, 10, 1–17. [Google Scholar] [CrossRef]
45. Yaqub, U.; Sharma, N.; Pabreja, R.; Chun, S.A.; Atluri, V.; Vaidya, J. Analysis and visualization of subjectivity and polarity of Twitter location data. In Proceedings of the 19th Annual International Conference on Digital Government Research: Governance in the Data Age, Delft, The Netherlands, 30 May–1 June 2018; ACM: New York, NY, USA, 2018; p. 67. [Google Scholar]
  46. Micu, A.; Micu, A.E.; Geru, M.; Lixandroiu, R.C. Analyzing user sentiment in social media: Implications for online marketing strategy. Psychol. Mark. 2017, 34, 1094–1100. [Google Scholar] [CrossRef]
  47. Hasan, A.; Moin, S.; Karim, A.; Shamshirband, S. Machine learning-based sentiment analysis for twitter accounts. Math. Comput. Appl. 2018, 23, 11. [Google Scholar] [CrossRef] [Green Version]
  48. Subirats, L.; Conesa, J.; Armayones, M. Biomedical holistic ontology for people with rare diseases. Int. J. Environ. Res. Public Health 2020, 17, 6038. [Google Scholar] [CrossRef]
49. Yang, L.; Li, Y.; Wang, J.; Sherratt, R.S. Sentiment analysis for e-commerce product reviews in Chinese based on sentiment lexicon and deep learning. IEEE Access 2020, 8, 23522–23530. [Google Scholar] [CrossRef]
  50. Commonwealth of Learning. Guidelines on Distance Education during COVID-19; Commonwealth of Learning: Burnaby, BC, Canada, 2020; Available online: http://oasis.col.org/handle/11599/3576 (accessed on 20 March 2020).
  51. McBurnie, C. The Use of Virtual Learning Environments and Learning Management Systems during the COVID-19 Pandemic; Zenodo: Geneva, Switzerland, 2020. [Google Scholar] [CrossRef]
  52. Mahajan, S. Technological, social, pedagogical issues must be resolved for online teaching. Indian Express, 29 April 2020. Available online: https://indianexpress.com/article/opinion/columns/india-coronavirus-lockdown-online-educationlearning-6383692/ (accessed on 20 March 2020).
  53. Cohen, J. Statistical Power Analysis for the Behavioral Sciences, 2nd ed.; Routledge: New York, NY, USA, 1988. [Google Scholar]
  54. Boykoff, M.T.; Roberts, J.T. Media Coverage of Climate Change: Current Trends, Strengths, Weaknesses. Human Development Report; United Nations: New York, NY, USA, 2017; Available online: http://hdr.undp.org/sites/default/files/boykoff_maxwell_and_roberts_j._timmons.pdf (accessed on 30 March 2020).
  55. Jia, M.; Tong, L.; Viswanath, P.V.; Zhang, Z. Word power: The impact of negative media coverage on disciplining corporate pollution. J. Bus. Ethics 2016, 138, 437–458. [Google Scholar] [CrossRef]
  56. Czerniewicz, L.; Agherdien, N.; Badenhorst, J.; Belluigi, D.; Chambers, T.; Chili, M.; De Villiers, M.; Felix, A.; Gachago, D.; Gokhale, C.; et al. A wake-up call: Equity, inequality and Covid-19 emergency remote teaching and learning. Postdigital Sci. Educ. 2020, 2, 946–967. [Google Scholar] [CrossRef]
Figure 1. An overview of the sentiment analysis approach.
Figure 2. Screening process and stages.
Figure 3. Visualisation of the polarity and subjectivity scores.
Figure 4. Polarity comparison of news articles and blogs.
Table 1. Encoding of the word ‘Great’.

| Word Form | Position | Sense | Polarity | Subjectivity | Intensity | Example |
|---|---|---|---|---|---|---|
| Great | Adjective | “very good” | 1.0 | 1.0 | 1.0 | “Owing to his exceptional practice, he is becoming great at tennis.” |
| Great | Adjective | “of major significance or importance” | 1.0 | 1.0 | 1.0 | “The low cost of these products gives them great appeal.” |
| Great | Adjective | “remarkable or out of the ordinary in degree or magnitude or effect” | 0.8 | 0.8 | 1.0 | “She is an actress of great charm.” |
| Great | Adjective | “relatively large in size or number or extent” | 0.4 | 0.2 | 1.0 | “All creatures great and small.” |
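As Table 1 shows, the lexicon stores several senses per word, each with its own polarity and subjectivity; a word-level score is obtained by averaging across senses. The following is a minimal pure-Python sketch of that dictionary-based averaging, using a hand-built stand-in lexicon containing only the four senses of "great" from Table 1 (it is an illustration of the mechanism, not TextBlob's actual implementation):

```python
from statistics import mean

# Stand-in sentiment lexicon: each word maps to a list of
# (polarity, subjectivity) pairs, one per sense (values from Table 1).
LEXICON = {
    "great": [(1.0, 1.0), (1.0, 1.0), (0.8, 0.8), (0.4, 0.2)],
}

def word_sentiment(word):
    """Average per-sense scores to get a single (polarity, subjectivity)."""
    senses = LEXICON.get(word.lower())
    if senses is None:
        return (0.0, 0.0)  # unknown words are treated as neutral/objective
    polarity = mean(p for p, _ in senses)
    subjectivity = mean(s for _, s in senses)
    return (polarity, subjectivity)

# Averaging the four senses in Table 1 gives polarity 0.8, subjectivity 0.75.
print(word_sentiment("great"))
```

Averaging over senses is why a strongly positive word like "great" ends up with a word-level polarity below its maximum sense score.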
Table 2. Descriptive statistics for polarity and subjectivity scores of the articles (N = 154).

| Score | Mean | SD |
|---|---|---|
| Polarity | 0.12 | 0.05 |
| Subjectivity | 0.43 | 0.05 |
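Table 2 reports the mean and standard deviation over the article-level scores. Given per-article scores, both numbers come straight from the standard library; the scores below are made-up placeholders, not the study's data:

```python
from statistics import mean, stdev

# Hypothetical per-article polarity scores (placeholders, not the 154 articles).
polarity_scores = [0.05, 0.10, 0.12, 0.15, 0.18]

# stdev() is the sample standard deviation (divides by n - 1).
print(f"Mean = {mean(polarity_scores):.2f}, SD = {stdev(polarity_scores):.2f}")
```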
Table 3. Independent sample t-test for the subjectivity scores.

| Group | N | Mean | SD | t-Value |
|---|---|---|---|---|
| News articles | 72 | 0.42 | 0.06 | 2.39 * |
| Blogs | 82 | 0.44 | 0.05 | |

* p < 0.05.
Table 4. Independent sample t-test for the polarity scores.

| Group | N | Mean | SD | t-Value |
|---|---|---|---|---|
| News articles | 72 | 0.11 | 0.06 | 2.61 * |
| Blogs | 82 | 0.13 | 0.05 | |

* p < 0.05.
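The statistics in Tables 3 and 4 are independent-samples t-tests. Using only the group sizes, means, and SDs, the pooled-variance t statistic can be recomputed by hand; the sketch below uses the rounded values from Table 3, so the result is close to, but not identical to, the reported 2.39 (the two-decimal rounding of the inputs shifts the value):

```python
from math import sqrt

def pooled_t(n1, m1, sd1, n2, m2, sd2):
    """Independent-samples t statistic with pooled variance
    (assumes equal variances across the two groups)."""
    sp2 = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
    se = sqrt(sp2 * (1 / n1 + 1 / n2))
    return (m2 - m1) / se

# Rounded summary statistics from Table 3 (subjectivity scores).
t = pooled_t(72, 0.42, 0.06, 82, 0.44, 0.05)
print(f"t({72 + 82 - 2}) = {t:.2f}")
```

The same function applied to Table 4's values (means 0.11 and 0.13) illustrates the polarity comparison; exact reproduction of the reported t-values would require the unrounded article-level scores.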
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Bhagat, K.K.; Mishra, S.; Dixit, A.; Chang, C.-Y. Public Opinions about Online Learning during COVID-19: A Sentiment Analysis Approach. Sustainability 2021, 13, 3346. https://doi.org/10.3390/su13063346
