A Gloss Composition and Context Clustering Based Distributed Word Sense Representation Model
Abstract
In recent years, there has been increasing interest in learning distributed representations of word senses. Traditional context clustering based models usually require careful tuning of model parameters and typically perform worse on infrequent word senses. This paper presents a novel approach that addresses these limitations by first initializing the word sense embeddings through learning sentence-level embeddings from WordNet glosses using a convolutional neural network. The initialized word sense embeddings are then used by a context clustering based model to generate the distributed representations of word senses. Our learned representations outperform the publicly available embeddings on half of the metrics in the word similarity task and on 6 out of 13 subtasks in the analogical reasoning task, and they give the best overall accuracy in the word sense effect classification task, which shows the effectiveness of our proposed distributed representation learning model.
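To make the second stage concrete, here is a minimal sketch of a context clustering step of the kind the abstract describes: each context vector is assigned to its nearest sense embedding, and that sense embedding is nudged toward the context. This is an illustrative simplification, not the paper's implementation; the sense vectors below stand in for the gloss-composed initializations, and the update rule, learning rate, and function names are assumptions.

```python
import numpy as np

def nearest_sense(context_vec, sense_vecs):
    """Return the index of the sense embedding with highest cosine
    similarity to the given context vector."""
    sims = sense_vecs @ context_vec / (
        np.linalg.norm(sense_vecs, axis=1) * np.linalg.norm(context_vec) + 1e-9)
    return int(np.argmax(sims))

def cluster_contexts(contexts, sense_vecs, lr=0.1, epochs=5):
    """Crude context clustering: assign each context vector to its
    nearest sense and move that sense embedding toward the context.
    `sense_vecs` plays the role of the gloss-initialized embeddings."""
    sense_vecs = sense_vecs.copy()
    for _ in range(epochs):
        for c in contexts:
            k = nearest_sense(c, sense_vecs)
            sense_vecs[k] += lr * (c - sense_vecs[k])
    return sense_vecs
```

Initializing the sense vectors from gloss embeddings (rather than randomly, as a plain clustering model would) is what lets the assignment step start near the right clusters, which is the motivation the abstract gives for the gloss composition stage.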
Chen, T.; Xu, R.; He, Y.; Wang, X. A Gloss Composition and Context Clustering Based Distributed Word Sense Representation Model. Entropy 2015, 17, 6007-6024.