Abstract
Neural network approaches learn end-to-end, without hand-crafted features, and achieve high performance in sentiment analysis. However, because of the complex architecture of a neural network, it is difficult to analyze how it works and to find the bottlenecks that limit its performance. To remedy this, we propose neural sentiment analysis with an attention mechanism. Using the attention mechanism, we can identify the words that are important for determining the sentiment polarity of a sentence. Moreover, we can understand why the classifier fails to predict sentiment polarity correctly. We compare our method with neural sentiment analysis without an attention mechanism on the TSUKUBA corpus and the Stanford Sentiment Treebank (SST). Experimental results show that our method is interpretable and achieves higher precision.
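To make the architecture described above concrete, here is a minimal sketch of a word-level attention classifier for sentiment polarity: a bidirectional LSTM encodes the sentence, an attention layer assigns a weight to each word, and the weighted sum of hidden states feeds a polarity classifier. This is an illustrative PyTorch implementation under assumed hyperparameters, not the authors' exact model (the class name, layer sizes, and two-class output are all assumptions).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionSentimentClassifier(nn.Module):
    """Hypothetical BiLSTM-with-attention sentiment classifier (sketch)."""
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)   # one unnormalized score per word
        self.out = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices
        h, _ = self.bilstm(self.embed(token_ids))       # (batch, seq_len, 2*hidden)
        scores = self.attn(h).squeeze(-1)               # (batch, seq_len)
        alpha = F.softmax(scores, dim=-1)               # attention weights over words
        context = (alpha.unsqueeze(-1) * h).sum(dim=1)  # weighted sentence vector
        return self.out(context), alpha                 # polarity logits + weights
```

The returned per-word weights `alpha` can be inspected to see which words the model relied on for its decision, which is the interpretability property the abstract describes: high-weight words indicate the terms driving the predicted polarity, and misplaced weight can explain misclassifications.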

Cite this article
Hiyama, Y., Yanagimoto, H. Word polarity attention in sentiment analysis. Artif Life Robotics 23, 311–315 (2018). https://doi.org/10.1007/s10015-018-0439-9