DOI: 10.1145/3341105.3374075

Personalizing large-scale text classification by modeling individual differences

Published: 30 March 2020

Abstract

Large-scale text classification organizes textual information into a wide variety of topics for subsequent analysis. However, most existing large-scale text classification models produce the same classification results for all users, without accounting for differences in individual perception that may be discernible in text semantics through distinct human characteristics. In this paper, we propose a personalized large-scale text classification model that factors in these individual differences when classifying text.
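The abstract does not spell out the model architecture, so the following is only a minimal sketch of the general idea of conditioning a text classifier on a per-user representation. All names, dimensions, the mean-of-word-vectors encoder, and the random stand-in weights are illustrative assumptions, not the authors' method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes (not from the paper): 50-dim word vectors,
# 8-dim user embeddings, 10 target categories.
WORD_DIM, USER_DIM, NUM_CLASSES = 50, 8, 10

# Random parameters standing in for learned weights.
vocab = {w: rng.normal(size=WORD_DIM) for w in ["cheap", "flight", "goal", "match"]}
user_embeddings = {"user_a": rng.normal(size=USER_DIM),
                   "user_b": rng.normal(size=USER_DIM)}
W = rng.normal(size=(WORD_DIM + USER_DIM, NUM_CLASSES))
b = np.zeros(NUM_CLASSES)

def classify(tokens, user_id):
    """Score topic categories for one document, conditioned on the user."""
    text_vec = np.mean([vocab[t] for t in tokens], axis=0)        # document representation
    joint = np.concatenate([text_vec, user_embeddings[user_id]])  # inject individual differences
    logits = joint @ W + b
    probs = np.exp(logits - logits.max())
    return probs / probs.sum()                                    # softmax over categories

# The same text can yield different category distributions for different users.
p_a = classify(["cheap", "flight"], "user_a")
p_b = classify(["cheap", "flight"], "user_b")
```

Here personalization enters simply by concatenating a user embedding to the document vector before classification; the actual model presumably learns both jointly and at a much larger scale.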



Published In

SAC '20: Proceedings of the 35th Annual ACM Symposium on Applied Computing
March 2020
2348 pages
ISBN:9781450368667
DOI:10.1145/3341105
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.


Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. large-scale text classification
  2. personalization
  3. user modeling

Qualifiers

  • Poster


Conference

SAC '20
Sponsor: SAC '20: The 35th ACM/SIGAPP Symposium on Applied Computing
March 30 - April 3, 2020
Brno, Czech Republic

Acceptance Rates

Overall Acceptance Rate 1,650 of 6,669 submissions, 25%

