Publication Type

Journal Article

Version

publishedVersion

Publication Date

2-2019

Abstract

Hashtags provide a simple and natural way of organizing content in microblog services. With the rapid growth of microblog services, the task of recommending hashtags for microblogs has received increasing attention in recent years. However, much of the existing research depends on hand-crafted features. Motivated by the successful use of neural models for many natural language processing tasks, in this paper we adopt an attention-based neural network to learn the representation of a microblog post. Unlike previous works, which focus only on content attention of microblogs, we propose a novel Topical Co-Attention Network (TCAN) that jointly models content attention and topic attention, such that the content representation is used to guide the topic attention and the topic representation is used to guide the content attention. We conduct experiments with different settings of TCAN on a large real-world dataset. Experimental results show that our model significantly outperforms various competitive baseline methods. Furthermore, the incorporation of the topical co-attention mechanism yields more than a 13.6% improvement in F1 score over standard LSTM-based methods.
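
The co-attention idea described in the abstract can be sketched as follows. This is a minimal illustrative sketch, assuming a PyTorch setting in which content_states are LSTM hidden states over the post's words and topic_vectors are topic representations from a topic model; the layer names, dimensions, and mean-pooled guidance vectors are assumptions for illustration, not the authors' exact TCAN architecture.

```python
# Minimal sketch of a topical co-attention layer (assumed PyTorch setting).
# Content attention is guided by a topic summary, and topic attention is
# guided by a content summary, mirroring the mutual guidance in the abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TopicalCoAttention(nn.Module):
    """Content and topic attention that guide each other (co-attention)."""

    def __init__(self, content_dim, topic_dim, att_dim):
        super().__init__()
        # Projections into a shared attention space (dimensions are assumptions).
        self.w_content = nn.Linear(content_dim, att_dim, bias=False)
        self.w_topic = nn.Linear(topic_dim, att_dim, bias=False)
        self.v_content = nn.Linear(att_dim, 1, bias=False)
        self.v_topic = nn.Linear(att_dim, 1, bias=False)

    def forward(self, content_states, topic_vectors):
        # content_states: (batch, seq_len, content_dim), e.g. LSTM hidden states
        # topic_vectors:  (batch, n_topics, topic_dim), e.g. topic embeddings

        # Mean-pooled summaries used as guidance for the other side (an assumption).
        content_summary = content_states.mean(dim=1)            # (batch, content_dim)
        topic_summary = topic_vectors.mean(dim=1)               # (batch, topic_dim)

        # Topic representation guides attention over content words.
        content_scores = self.v_content(torch.tanh(
            self.w_content(content_states) +
            self.w_topic(topic_summary).unsqueeze(1)))          # (batch, seq_len, 1)
        content_att = F.softmax(content_scores, dim=1)
        content_repr = (content_att * content_states).sum(dim=1)

        # Content representation guides attention over topics.
        topic_scores = self.v_topic(torch.tanh(
            self.w_topic(topic_vectors) +
            self.w_content(content_summary).unsqueeze(1)))      # (batch, n_topics, 1)
        topic_att = F.softmax(topic_scores, dim=1)
        topic_repr = (topic_att * topic_vectors).sum(dim=1)

        # Joint representation that would feed a hashtag classifier.
        return torch.cat([content_repr, topic_repr], dim=-1)
```

In use, the concatenated output would typically be passed to a softmax classifier over the hashtag vocabulary; that downstream layer is omitted here.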

Keywords

Hashtag recommendation, Long short-term memory, Co-attention, Topic model

Discipline

Computer Engineering | Programming Languages and Compilers

Research Areas

Data Science and Engineering

Publication

Neurocomputing

Volume

331

First Page

356

Last Page

365

ISSN

0925-2312

Identifier

10.1016/j.neucom.2018.11.057

Publisher

Elsevier

Additional URL

https://doi.org/10.1016/j.neucom.2018.11.057
