A Testset-Based Method to Analyse the Negation-Detection Performance of Lexicon-Based Sentiment Analysis Tools
Abstract
1. Introduction
2. A Theoretical Background of Negations
3. Corpus-Based vs. Testset-Based Approach
3.1. The Corpus-Based Approach
3.2. The Testset-Based Approach
4. The Testset
5. Libraries under Test
- syuzhet;
- sentimentR;
- SentimentAnalysis;
- RSentiment [55];
- meanr;
- VADER (Valence Aware Dictionary for Sentiment Reasoning).
6. Performance Evaluation Metrics
- Accuracy;
- Sensitivity;
- Specificity;
- Precision;
- F-score.
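All five metrics reduce to simple ratios of the confusion-matrix counts (TP, FP, TN, FN). A minimal Python sketch (the function name is illustrative, not from the paper; the example counts are chosen for illustration and happen to reproduce the RSentiment row of the results table up to rounding):

```python
def classification_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Compute the five evaluation metrics from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)   # overall agreement
    sensitivity = tp / (tp + fn)                 # recall on true positives
    specificity = tn / (tn + fp)                 # recall on true negatives
    precision = tp / (tp + fp)                   # reliability of positive calls
    f_score = 2 * precision * sensitivity / (precision + sensitivity)
    return {
        "accuracy": accuracy,
        "sensitivity": sensitivity,
        "specificity": specificity,
        "precision": precision,
        "f_score": f_score,
    }

# Example: TP = 5, FP = 6, TN = 20, FN = 1 over a 32-sentence testset
# yields accuracy 0.781, sensitivity 0.833, specificity 0.769,
# precision 0.455, F-score 0.588.
m = classification_metrics(5, 6, 20, 1)
```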
7. Results
8. Discussion and Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Sample Availability
Abbreviations
Abbreviation | Meaning
---|---
DOAJ | Directory of Open Access Journals
FN | False Negative
FP | False Positive
MDPI | Multidisciplinary Digital Publishing Institute
PoS | Parts of Speech
SFU | Simon Fraser University
SOCC | SFU Opinion and Comments Corpus
TN | True Negative
TP | True Positive
UK | United Kingdom
VADER | Valence Aware Dictionary for sEntiment Reasoning
Appendix A. Sentence Testset
Sentence | PoS Composition |
---|---|
This device is perfect, isn’t it | DT NN VB (POS)JJ VB RB PRP |
This device is not perfect | DT NN VB (NEG)RB (POS)JJ |
This device isn’t perfect | DT NN VB (NEG)RB (POS)JJ |
This device is not perfect! | DT NN VB (NEG)RB (POS)JJ. |
This device is not perfect at all | DT NN VB (NEG)RB (POS)JJ IN DT |
This device is not perfect, is it? | DT NN VB (NEG)RB (POS)JJ, VB PRP. |
This device is not perfect and useless | DT NN VB (NEG)RB (POS)JJ CC (NEG)JJ |
This device is not perfect and almost useless | DT NN VB (NEG)RB (POS)JJ CC RB (NEG)JJ |
This device is useless and not perfect | DT NN VB (NEG)JJ CC (NEG)RB (POS)JJ |
She is unreliable | PRP VB (NEG)JJ |
This device is neither perfect nor useful | DT NN VB RB JJ CC JJ |
This device is not either perfect or useful | DT NN VB RB RB JJ CC JJ |
This device is not perfect, but very useful | DT NN VB RB JJ, CC RB JJ |
She is not reliable | PRP VB RB JJ |
This device is not perfect, but useless | DT NN VB RB JJ, CC JJ |
This device is not perfect and useless | DT NN VB RB JJ, CC JJ |
This device is not perfect and hardly useless | DT NN VB RB JJ, CC RB JJ |
This device is useless and not perfect | DT NN VB JJ, CC RB JJ |
She is not unreliable | PRP VB RB JJ |
This device is not the most perfect yet it is very useful | DT NN VB RB DT RBS JJ CC PRP VB RB JJ |
I do not think this device is perfect | PRP VBP RB VB DT NN VB JJ |
I do not think this device could be perfect | PRP VBP RB VB DT NN MD VB JJ |
This device has never been perfect | DT NN VB RB VBN JJ |
This device has never been imperfect | DT NN VB RB VBN JJ |
This device has never been perfect and useful | DT NN VB RB VBN JJ CC JJ |
This device has never been perfect but useful | DT NN VB RB VBN JJ CC JJ |
This device has never been either perfect or useful | DT NN VB RB VBN RB JJ CC JJ |
I do not like it at all | PRP VBP RB VB PRP IN DT |
I don’t like it | PRP VBP RB VB PRP |
I dislike anything | PRP VBP NN |
I like nothing | PRP VBP NN |
Nobody likes it | NN VB PRP |
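The testset-based approach amounts to running each tool on these sentences and comparing the estimated polarity against the annotated one. The failure mode the testset is built to expose can be illustrated with a deliberately naive lexicon scorer; both the toy lexicon and the scoring functions below are invented for illustration and correspond to none of the packages under test:

```python
# Toy valence lexicon (illustrative only).
LEXICON = {"perfect": 1, "useful": 1, "useless": -1, "reliable": 1}


def naive_score(sentence: str) -> int:
    """Sum word valences, ignoring negation cues entirely."""
    words = sentence.lower().replace(",", "").replace("!", "").split()
    return sum(LEXICON.get(w, 0) for w in words)


def negation_aware_score(sentence: str) -> int:
    """Minimal negation rule: flip a word's valence when 'not' or
    'never' occurs among the three preceding tokens."""
    words = sentence.lower().replace(",", "").replace("!", "").split()
    score = 0
    for i, w in enumerate(words):
        v = LEXICON.get(w, 0)
        if any(t in ("not", "never") for t in words[max(0, i - 3):i]):
            v = -v
        score += v
    return score


print(naive_score("This device is not perfect"))            # 1: misses the negation
print(negation_aware_score("This device is not perfect"))   # -1: flips "perfect"
```

A scorer of the first kind classifies "This device is not perfect" as positive; the windowed flipping rule of the second kind recovers the intended polarity, though it still fails on harder testset items such as morphological negation ("unreliable") or double negation ("not unreliable").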
References
- Feldman, R. Techniques and applications for sentiment analysis. Commun. ACM 2013, 56, 82–89. [Google Scholar] [CrossRef]
- Medhat, W.; Hassan, A.; Korashy, H. Sentiment analysis algorithms and applications: A survey. Ain Shams Eng. J. 2014, 5, 1093–1113. [Google Scholar] [CrossRef] [Green Version]
- Joachims, T. Making large-scale support vector machine learning practical. In Advances in Kernel Methods: Support Vector Learning; MIT Press: Cambridge, MA, USA, 1999; pp. 169–184. [Google Scholar]
- Turney, P.D. Thumbs up or thumbs down? Semantic orientation applied to unsupervised classification of reviews. arXiv 2002, arXiv:cs/0212032. [Google Scholar]
- Prabowo, R.; Thelwall, M. Sentiment analysis: A combined approach. J. Inf. 2009, 3, 143–157. [Google Scholar] [CrossRef]
- Zhang, L.; Wang, S.; Liu, B. Deep learning for sentiment analysis: A survey. Wiley Interdiscip. Rev. Data Min. Knowl. Discov. 2018, 8, e1253. [Google Scholar] [CrossRef] [Green Version]
- Wang, J.; Xu, B.; Zu, Y. Deep learning for aspect-based sentiment analysis. In Proceedings of the 2021 International Conference on Machine Learning and Intelligent Systems Engineering (MLISE), Chongqing, China, 9–11 July 2021; pp. 267–271. [Google Scholar]
- Kratzwald, B.; Ilić, S.; Kraus, M.; Feuerriegel, S.; Prendinger, H. Deep learning for affective computing: Text-based emotion recognition in decision support. Decis. Support Syst. 2018, 115, 24–35. [Google Scholar] [CrossRef] [Green Version]
- Cambria, E. Affective Computing and Sentiment Analysis. IEEE Intell. Syst. 2016, 31, 102–107. [Google Scholar] [CrossRef]
- Cambria, E.; Das, D.; Bandyopadhyay, S.; Feraco, A. Affective computing and sentiment analysis. In A Practical Guide to Sentiment Analysis; Springer: Berlin/Heidelberg, Germany, 2017; pp. 1–10. [Google Scholar]
- Guo, F.; Li, F.; Lv, W.; Liu, L.; Duffy, V.G. Bibliometric analysis of affective computing researches during 1999∼2018. Int. J. Hum.–Comput. Interact. 2020, 36, 801–814. [Google Scholar] [CrossRef]
- Basiri, M.E.; Nemati, S.; Abdar, M.; Cambria, E.; Acharya, U.R. ABCDM: An attention-based bidirectional CNN-RNN deep model for sentiment analysis. Future Gener. Comput. Syst. 2021, 115, 279–294. [Google Scholar] [CrossRef]
- Usama, M.; Ahmad, B.; Song, E.; Hossain, M.S.; Alrashoud, M.; Muhammad, G. Attention-based sentiment analysis using convolutional and recurrent neural network. Future Gener. Comput. Syst. 2020, 113, 571–578. [Google Scholar] [CrossRef]
- Cambria, E.; Li, Y.; Xing, F.Z.; Poria, S.; Kwok, K. SenticNet 6: Ensemble application of symbolic and subsymbolic AI for sentiment analysis. In Proceedings of the 29th ACM International Conference on Information & knowledge Management, Virtual Event, Ireland, 19–23 October 2020; pp. 105–114. [Google Scholar]
- Hussein, D.M.E.D.M. A survey on sentiment analysis challenges. J. King Saud Univ.-Eng. Sci. 2018, 30, 330–338. [Google Scholar] [CrossRef]
- Heerschop, B.; van Iterson, P.; Hogenboom, A.; Frasincar, F.; Kaymak, U. Accounting for negation in sentiment analysis. In Proceedings of the 11th Dutch-Belgian Information Retrieval Workshop (DIR 2011). Citeseer, Amsterdam, The Netherlands, 4 February 2011; pp. 38–39. [Google Scholar]
- Hogenboom, A.; Van Iterson, P.; Heerschop, B.; Frasincar, F.; Kaymak, U. Determining negation scope and strength in sentiment analysis. In Proceedings of the 2011 IEEE International Conference on Systems, Man and Cybernetics, Anchorage, AK, USA, 9–12 October 2011; pp. 2589–2594. [Google Scholar]
- Asmi, A.; Ishaya, T. Negation identification and calculation in sentiment analysis. In Proceedings of the second international conference on advances in information mining and management, Venice, Italy, 21–26 October 2012; pp. 1–7. [Google Scholar]
- Dadvar, M.; Hauff, C.; de Jong, F. Scope of negation detection in sentiment analysis. In Proceedings of the Dutch-Belgian Information Retrieval Workshop (DIR 2011). Citeseer, Amsterdam, The Netherlands, 27–28 January 2011; pp. 16–20. [Google Scholar]
- Jia, L.; Yu, C.; Meng, W. The effect of negation on sentiment analysis and retrieval effectiveness. In Proceedings of the 18th ACM Conference on Information and Knowledge Management, Hong Kong, China, 2–6 November 2009; pp. 1827–1830. [Google Scholar]
- Remus, R. Modeling and Representing Negation in Data-driven Machine Learning-based Sentiment Analysis. In Proceedings of the ESSEM@ AI*IA, Turin, Italy, 3 December 2013; pp. 22–33. [Google Scholar]
- Wiegand, M.; Balahur, A.; Roth, B.; Klakow, D.; Montoyo, A. A survey on the role of negation in sentiment analysis. In Proceedings of the Workshop on Negation and Speculation in Natural Language Processing, Uppsala, Sweden, 10 July 2010; pp. 60–68. [Google Scholar]
- Lapponi, E.; Read, J.; Øvrelid, L. Representing and resolving negation for sentiment analysis. In Proceedings of the 2012 IEEE 12th International Conference on Data Mining Workshops, Brussels, Belgium, 10–13 December 2012; pp. 687–692. [Google Scholar]
- Farooq, U.; Mansoor, H.; Nongaillard, A.; Ouzrout, Y.; Qadir, M.A. Negation Handling in Sentiment Analysis at Sentence Level. JCP 2017, 12, 470–478. [Google Scholar] [CrossRef]
- Barnes, J.; Velldal, E.; Øvrelid, L. Improving sentiment analysis with multi-task learning of negation. arXiv 2019, arXiv:1906.07610. [Google Scholar] [CrossRef]
- Pröllochs, N.; Feuerriegel, S.; Lutz, B.; Neumann, D. Negation scope detection for sentiment analysis: A reinforcement learning framework for replicating human interpretations. Inf. Sci. 2020, 536, 205–221. [Google Scholar] [CrossRef]
- Díaz, N.P.C.; López, M.J.M. Negation and Speculation Detection; John Benjamins Publishing Company: Amsterdam, The Netherlands, 2019; Volume 13. [Google Scholar]
- Jiménez-Zafra, S.M.; Morante, R.; Teresa Martín-Valdivia, M.; Ureña-López, L.A. Corpora Annotated with Negation: An Overview. Comput. Linguist. 2020, 46, 1–52. [Google Scholar] [CrossRef]
- Councill, I.G.; McDonald, R.; Velikovich, L. What’s great and what’s not: Learning to classify the scope of negation for improved sentiment analysis. In Proceedings of the Workshop on Negation and Speculation in Natural Language Processing, Uppsala, Sweden, 10 July 2010; pp. 51–59. [Google Scholar]
- Jespersen, O. Negation in English and Other Languages; Host: Kobenhavn, Denmark, 1917. [Google Scholar]
- Horn, L. A Natural History of Negation; University of Chicago Press: Chicago, IL, USA, 1989. [Google Scholar]
- Morante, R.; Sporleder, C. Modality and negation: An introduction to the special issue. Comput. Linguist. 2012, 38, 223–260. [Google Scholar] [CrossRef]
- Laka, I. Negation in syntax: On the nature of functional categories and projections. Anu. Semin. Filol. Vasca Julio Urquijo 1990, 25, 65–136. [Google Scholar]
- Tottie, G. Negation in English Speech and Writing: A Study in Variation; Academic Press: Cambridge, MA, USA, 1991; Volume 4. [Google Scholar]
- Huddleston, R.D.; Pullum, G.K. The Cambridge Grammar of the English Language; Cambridge University Press: Cambridge, UK; New York, NY, USA, 2002. [Google Scholar]
- Liu, B. Sentiment Analysis: Mining Opinions, Sentiments and Emotions; Cambridge University Press: Cambridge, UK, 2015. [Google Scholar]
- De Albornoz, J.C.; Plaza, L.; Díaz, A.; Ballesteros, M. Ucm-i: A rule-based syntactic approach for resolving the scope of negation. In Proceedings of the First Joint Conference on Lexical and Computational Semantics-Volume 1: Proceedings of the Main Conference and the Shared Task and Volume 2: Proceedings of the Sixth International Workshop on Semantic Evaluation, Montréal, QC, Canada, 7–8 June 2012; pp. 282–287. [Google Scholar]
- Ballesteros, M.; Francisco, V.; Díaz, A.; Herrera, J.; Gervás, P. Inferring the scope of negation in biomedical documents. In Proceedings of the International Conference on Intelligent Text Processing and Computational Linguistics; Springer: Berlin/Heidelberg, Germany, 2012; pp. 363–375. [Google Scholar]
- Cruz, N.P.; Taboada, M.; Mitkov, R. A machine learning approach to negation and speculation detection for sentiment analysis. J. Assoc. Inf. Sci. Technol. 2016, 67, 2118–2136. [Google Scholar] [CrossRef] [Green Version]
- Skeppstedt, M.; Paradis, C.; Kerren, A. Marker Words for Negation and Speculation in Health Records and Consumer Reviews. In Proceedings of the 7th International Symposium on Semantic Mining in Biomedicine (SMBM), Potsdam, Germany, 4–5 August 2016; pp. 64–69. [Google Scholar]
- Qian, Z.; Li, P.; Zhu, Q.; Zhou, G.; Luo, Z.; Luo, W. Speculation and negation scope detection via convolutional neural networks. In Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, Austin, TX, USA, 1–5 November 2016; pp. 815–825. [Google Scholar]
- Lazib, L.; Zhao, Y.; Qin, B.; Liu, T. Negation scope detection with recurrent neural networks models in review texts. Int. J. High Perform. Comput. Netw. 2019, 13, 211–221. [Google Scholar] [CrossRef]
- Chapman, W.W.; Bridewell, W.; Hanbury, P.; Cooper, G.F.; Buchanan, B.G. A simple algorithm for identifying negated findings and diseases in discharge summaries. J. Biomed. Inform. 2001, 34, 301–310. [Google Scholar] [CrossRef] [Green Version]
- Harkema, H.; Dowling, J.N.; Thornblade, T.; Chapman, W.W. ConText: An algorithm for determining negation, experiencer and temporal status from clinical reports. J. Biomed. Inform. 2009, 42, 839–851. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Gindl, S.; Kaiser, K.; Miksch, S. Syntactical negation detection in clinical practice guidelines. Stud. Health Technol. Inform. 2008, 136, 187. [Google Scholar] [PubMed]
- Konstantinova, N.; De Sousa, S.C.; Díaz, N.P.C.; López, M.J.M.; Taboada, M.; Mitkov, R. A review corpus annotated for negation, speculation and their scope. In Proceedings of the LREC, Istanbul, Turkey, 21–27 May 2012; pp. 3190–3195. [Google Scholar]
- Reitan, J.; Faret, J.; Gambäck, B.; Bungum, L. Negation scope detection for Twitter sentiment analysis. In Proceedings of the 6th Workshop on Computational Approaches to Subjectivity, Sentiment and Social Media Analysis, Lisbon, Portugal, 17 September 2015; pp. 99–108. [Google Scholar]
- Kolhatkar, V.; Wu, H.; Cavasso, L.; Francis, E.; Shukla, K.; Taboada, M. The SFU opinion and comments corpus: A corpus for the analysis of online news comments. Corpus Pragmat. 2019, 4, 155–190. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Elazhary, H. Negminer: An automated tool for mining negations from electronic narrative medical documents. Int. J. Intell. Syst. Appl. 2017, 9, 14. [Google Scholar] [CrossRef] [Green Version]
- Marcus, M.P.; Marcinkiewicz, M.A. Building a Large Annotated Corpus of English: The Penn Treebank. Comput. Linguist. 1994, 19, 273. [Google Scholar]
- Taylor, A.; Marcus, M.; Santorini, B. The Penn Treebank: An overview. In Treebanks; Springer: Berlin/Heidelberg, Germany, 2003; pp. 5–22. [Google Scholar]
- Santorini, B. Part-of-Speech Tagging Guidelines for the Penn Treebank Project (3rd Revision); Tech. Rep. (CIS); University of Pennsylvania: Philadelphia, PA, USA, 1990; MS-CIS-90-47. [Google Scholar]
- Misuraca, M.; Forciniti, A.; Scepi, G.; Spano, M. Sentiment Analysis for Education with R: Packages, methods and practical applications. arXiv 2020, arXiv:2005.12840. [Google Scholar]
- Naldi, M. A review of sentiment computation methods with R packages. arXiv 2019, arXiv:1901.08319. [Google Scholar]
- Bose, S.; Saha, U.; Kar, D.; Goswami, S.; Nayak, A.K.; Chakrabarti, S. RSentiment: A tool to extract meaningful insights from textual reviews. In Proceedings of the 5th International Conference on Frontiers in Intelligent Computing: Theory and Applications; Springer: Berlin/Heidelberg, Germany, 2017; pp. 259–268. [Google Scholar]
- Jockers, M. Package ‘Syuzhet’. 2017. Available online: https://cran.r-project.org/web/packages/syuzhet/index.html (accessed on 10 January 2023).
- Valdivia, A.; Luzón, M.V.; Herrera, F. Sentiment analysis in Tripadvisor. IEEE Intell. Syst. 2017, 32, 72–77. [Google Scholar] [CrossRef]
- Liu, D.; Lei, L. The appeal to political sentiment: An analysis of Donald Trump’s and Hillary Clinton’s speech themes and discourse strategies in the 2016 US presidential election. Discourse Context Media 2018, 25, 143–152. [Google Scholar] [CrossRef]
- Yoon, S.; Parsons, F.; Sundquist, K.; Julian, J.; Schwartz, J.E.; Burg, M.M.; Davidson, K.W.; Diaz, K.M. Comparison of different algorithms for sentiment analysis: Psychological stress notes. Stud. Health Technol. Inform. 2017, 245, 1292. [Google Scholar]
- Rinker, T. Package ‘Sentimentr’. 2017. Available online: https://cran.r-project.org/web/packages/sentimentr/index.html (accessed on 10 January 2023).
- Ikoro, V.; Sharmina, M.; Malik, K.; Batista-Navarro, R. Analyzing sentiments expressed on Twitter by UK energy company consumers. In Proceedings of the 2018 Fifth International Conference on Social Networks Analysis, Management and Security (SNAMS), Valencia, Spain, 15–18 October 2018; pp. 95–98. [Google Scholar]
- Weissman, G.E.; Ungar, L.H.; Harhay, M.O.; Courtright, K.R.; Halpern, S.D. Construct validity of six sentiment analysis methods in the text of encounter notes of patients with critical illness. J. Biomed. Inform. 2019, 89, 114–121. [Google Scholar] [CrossRef]
- Pröllochs, N.; Feuerriegel, S.; Neumann, D. Generating Domain-Specific Dictionaries using Bayesian Learning. In Proceedings of the European Conference on Information Systems (ECIS), Münster, Germany, 26–29 May 2015. [Google Scholar]
- Kassraie, P.; Modirshanechi, A.; Aghajan, H.K. Election Vote Share Prediction using a Sentiment-based Fusion of Twitter Data with Google Trends and Online Polls. In Proceedings of the 6th International Conference on Data Science, Technology and Applications (DATA 2017), Madrid, Spain, 24–26 July 2017; pp. 363–370. [Google Scholar]
- Haghighi, N.N.; Liu, X.C.; Wei, R.; Li, W.; Shao, H. Using Twitter data for transit performance assessment: A framework for evaluating transit riders’ opinions about quality of service. Public Transp. 2018, 10, 363–377. [Google Scholar] [CrossRef]
- Hu, M.; Liu, B. Mining opinion features in customer reviews. In Proceedings of the AAAI, San Jose, CA, USA, 25–29 July 2004; Volume 4, pp. 755–760. [Google Scholar]
- Hutto, C.; Gilbert, E. Vader: A parsimonious rule-based model for sentiment analysis of social media text. In Proceedings of the International AAAI Conference on Weblogs and Social Media, Ann Arbor, MI, USA, 1–4 June 2014; Volume 8. [Google Scholar]
- Newman, H.; Joyner, D. Sentiment analysis of student evaluations of teaching. In Proceedings of the International Conference on Artificial Intelligence in Education; Springer: Berlin/Heidelberg, Germany, 2018; pp. 246–250. [Google Scholar]
- Borg, A.; Boldt, M. Using VADER sentiment and SVM for predicting customer response sentiment. Expert Syst. Appl. 2020, 162, 113746. [Google Scholar] [CrossRef]
- Wang, Z.; Ho, S.B.; Cambria, E. Multi-level fine-scaled sentiment sensing with ambivalence handling. Int. J. Uncertain. Fuzziness Knowl.-Based Syst. 2020, 28, 683–697. [Google Scholar] [CrossRef]
- Seliya, N.; Khoshgoftaar, T.M.; Hulse, J.V. A Study on the Relationships of Classifier Performance Metrics. In Proceedings of the 2009 21st IEEE International Conference on Tools with Artificial Intelligence, Newark, NJ, USA, 2–5 November 2009; pp. 59–66. [Google Scholar]
- Zaki, M.J.; Meira Jr, W. Data Mining and Machine Learning: Fundamental Concepts and Algorithms; Cambridge University Press: Cambridge, UK, 2019. [Google Scholar]
- Murphy, K.P. Machine Learning: A Probabilistic Perspective; MIT Press: Cambridge, MA, USA, 2012. [Google Scholar]
Source: A = journal article; B = book; P = proceedings.

Author(s) | Year | Source | Purpose & Major Themes |
---|---|---|---|
Councill et al. [29] | 2010 | P | Classifying the scope of negation for improved sentiment analysis. Importance of negation-scope detection in terms of positively affecting the performance of the sentiment analysis |
Wiegand et al. [22] | 2010 | P | Investigating the role of negation in sentiment analysis. The authors claim that an effective negation model for sentiment analysis requires knowledge of polarity expressions |
Jespersen [30] | 1917 | B | Investigating negations in English and in other languages. One of the first exhaustive theoretical studies on the main issues of negation |
Horn [31] | 1989 | B | Providing a comprehensive overview of the taxonomies of negation. One of the first exhaustive theoretical studies on the main issues of negation. |
Morante et al. [32] | 2012 | A | Investigating modality and negation. Negation detection in sentiment analysis and opinion mining |
Laka [33] | 2013 | A | Exploring the nature of functional categories and projections of negation in syntax. Negation detection in sentiment analysis and opinion mining |
Tottie [34] | 1991 | B | Providing an overview of negation in English speech and writing and taxonomies for analysing English negation (with focus on clausal negations) |
Huddleston & Pullum [35] | 2002 | B | Providing a comprehensive taxonomy of negation in English, using four binomial categories: (1) verbal vs. non-verbal; (2) analytic vs. synthetic negation; (3) clausal vs. subclausal negation; (4) ordinary vs. metalinguistic negation |
Liu [36] | 2015 | B | Giving a comprehensive overview of the state of the art of sentiment analysis from a primarily natural language processing point of view. The book covers all core areas of sentiment analysis, including debate analysis, intention mining and fake-opinion detection. It provides computational methods to analyse and summarise opinions. |
de Albornoz et al. [37] | 2012 | P | Providing a rule-based syntactic approach to resolving the scope of negation. Negation detection is investigated through the rule-based approach |
Ballesteros et al. [38] | 2012 | P | Investigating the scope of negation in biomedical documents. Negation detection is investigated through the rule-based approach |
Cruz et al. [39] | 2016 | A | Providing a machine learning approach to negation and speculation detection for sentiment analysis. Negation detection is investigated through machine learning techniques |
Lapponi et al. [23] | 2012 | P | Providing tools for resolving negation for sentiment analysis. Negation detection is investigated through machine learning techniques |
Skeppstedt et al. [40] | 2016 | P | Investigating the scope of negation in Health Records and Consumer Reviews. Negation detection is investigated through machine learning techniques |
Qian et al. [41] | 2016 | P | Speculating on negation scope detection via convolutional neural networks. This contribution demonstrates the efficiency of the deep learning algorithms when applied to the negation-detection process |
Lazib et al. [42] | 2019 | A | Investigating how to detect negation scope via recurrent neural network models. This article demonstrates the efficiency of the deep learning algorithms when applied to the negation-detection process |
Package | Downloads (×1000) |
---|---|
syuzhet | 1100 |
sentimentR | 244 |
SentimentAnalysis | 104 |
RSentiment | 56 |
meanr | 28 |
VADER | 15 |
texter | 2 |
 | Estimated + | Estimated − |
---|---|---|
True + | True Positive | False Negative |
True − | False Positive | True Negative |
Package | Accuracy | Sensitivity | Specificity | Precision | F-Score |
---|---|---|---|---|---|
sentimentR | 0.719 | 0.833 * | 0.692 | 0.395 | 0.526 |
syuzhet/syuzhet | 0.188 | 0.667 | 0.077 | 0.143 | 0.235 |
syuzhet/afinn | 0.156 | 0.667 | 0.038 | 0.138 | 0.229 |
syuzhet/bing | 0.188 | 0.667 | 0.077 | 0.143 | 0.235 |
SentimentAnalysis | 0.219 | 0.667 | 0.115 | 0.148 | 0.242 |
meanr | 0.25 | 0.667 | 0.154 | 0.154 | 0.25 |
VADER | 0.688 | 0.8 | 0.667 | 0.308 | 0.444 |
RSentiment | 0.781 * | 0.833 * | 0.769 * | 0.455 * | 0.588 * |

(* best value in each column)
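As a sanity check, the reported F-scores agree (up to rounding) with the harmonic mean of the reported precision and sensitivity; a short Python check for two of the tools, with the values copied from the table above:

```python
# (precision, sensitivity, reported F-score) per tool, from the results table.
rows = {
    "VADER": (0.308, 0.800, 0.444),
    "RSentiment": (0.455, 0.833, 0.588),
}

for name, (p, r, f_reported) in rows.items():
    f = 2 * p * r / (p + r)  # F-score: harmonic mean of precision and recall
    assert abs(f - f_reported) < 0.005, name
```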
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Naldi, M.; Petroni, S. A Testset-Based Method to Analyse the Negation-Detection Performance of Lexicon-Based Sentiment Analysis Tools. Computers 2023, 12, 18. https://doi.org/10.3390/computers12010018