Article

Emotion Estimation Method Based on Emoticon Image Features and Distributed Representations of Sentences

1 Faculty of Software and Information Technology, Aomori University, Aomori 030-0943, Japan
2 Graduate School of Technology, Industrial and Social Sciences, Tokushima University, Tokushima 770-8506, Japan
* Authors to whom correspondence should be addressed.
Academic Editor: Alexandros A. Lavdas
Appl. Sci. 2022, 12(3), 1256; https://doi.org/10.3390/app12031256
Received: 17 December 2021 / Revised: 12 January 2022 / Accepted: 21 January 2022 / Published: 25 January 2022
This paper proposes an emotion recognition method for tweets that contain emoticons, using both the emoticons' image features and the tweets' language features. Some existing methods register emoticons and their facial-expression categories in a dictionary, while others recognize emoticon facial expressions from the various elements that make up the emoticons. However, highly accurate emotion recognition cannot be achieved unless it is based on a combination of sentence and emoticon features. We therefore propose a model that recognizes emotions by extracting the shape features of emoticons from their image data and feeding the classifier a feature vector that combines these image features with features extracted from the tweet text. Evaluation experiments confirm that the proposed method achieves high accuracy and is more effective than methods that use text features alone.
Keywords: emoticon; emotion estimation; multimodal information
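The abstract describes a fusion architecture that concatenates emoticon image features with a distributed representation of the tweet text before emotion classification. The following minimal sketch illustrates that idea only; it is not the authors' implementation, and the layer sizes, the six-class emotion set, the 32x32 grayscale emoticon rendering, and the 300-dimensional sentence vectors are illustrative assumptions.

# Minimal sketch (assumptions noted above) of the fusion idea described in the
# abstract: concatenate emoticon image features with a distributed
# representation of the tweet text, then classify the emotion.
import torch
import torch.nn as nn

class EmoticonTextFusionClassifier(nn.Module):
    def __init__(self, sentence_dim=300, num_emotions=6):
        super().__init__()
        # Small CNN over a rendered emoticon image (assumed 1 x 32 x 32 grayscale)
        self.image_encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                             # -> 16 x 16 x 16
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                             # -> 32 x 8 x 8
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, 128), nn.ReLU(),
        )
        # Classifier over the concatenated image + sentence features
        self.classifier = nn.Sequential(
            nn.Linear(128 + sentence_dim, 128), nn.ReLU(),
            nn.Linear(128, num_emotions),
        )

    def forward(self, emoticon_image, sentence_vector):
        image_feat = self.image_encoder(emoticon_image)           # (B, 128)
        fused = torch.cat([image_feat, sentence_vector], dim=1)   # (B, 128 + sentence_dim)
        return self.classifier(fused)                             # emotion logits

# Usage with dummy tensors: a batch of 4 rendered emoticons and
# 300-dimensional sentence embeddings (e.g., from a doc2vec-style model).
model = EmoticonTextFusionClassifier()
images = torch.randn(4, 1, 32, 32)
sentences = torch.randn(4, 300)
logits = model(images, sentences)   # shape: (4, 6)

The design choice this sketch highlights is early (feature-level) fusion: the image and text features are joined into a single vector before classification, which matches the abstract's description of a combined feature-vector input rather than separate per-modality classifiers.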
MDPI and ACS Style

Fujisawa, A.; Matsumoto, K.; Yoshida, M.; Kita, K. Emotion Estimation Method Based on Emoticon Image Features and Distributed Representations of Sentences. Appl. Sci. 2022, 12, 1256. https://doi.org/10.3390/app12031256


