
Neural CTR Prediction for Native Ad

  • Conference paper

Chinese Computational Linguistics (CCL 2019)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11856)

Abstract

Native ads are an important kind of online advertising that have a similar form to the other content on the same platform. Compared with search ads, predicting the click-through rate (CTR) of native ads is more challenging, since there is no explicit user intent. Learning accurate representations of users and ads that can capture user interests and ad characteristics is critical to this task. Existing methods usually rely on a single kind of user behavior for user modeling and ignore the textual information in ads and user behaviors. In this paper, we propose a neural approach for native ad CTR prediction which can incorporate different kinds of user behaviors to model user interests, and can fully exploit the textual information in ads and user behaviors to learn accurate ad and user representations. The core of our approach is an ad encoder and a user encoder. In the ad encoder we learn representations of ads from their titles and descriptions. In the user encoder we propose a multi-view framework to learn representations of users from both their search queries and their browsed webpages by regarding different kinds of behaviors as different views of users. In each view we learn user representations using a hierarchical model and use attention to select important words, search queries and webpages. Experiments on a real-world dataset validate that our approach can effectively improve the performance of native ad CTR prediction.

This work was done when the first author was an intern at Microsoft Research Asia.
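The abstract describes the model only at a high level. As a reading aid, the following is a minimal PyTorch sketch of how the attention-based ad encoder and the hierarchical, multi-view user encoder it mentions could be wired together. This is not the authors' code: every class name, layer size, the use of simple additive attention pooling, and the sigmoid dot-product click scorer are assumptions made for illustration.

```python
# Minimal sketch (not the authors' released code) of the architecture described in
# the abstract: attention pooling at every level, an ad encoder over title and
# description words, and a multi-view user encoder over search queries and browsed
# webpages. All names, layer sizes, and the scoring function are assumptions.
import torch
import torch.nn as nn


class AttentionPooling(nn.Module):
    """Scores each item in a sequence and returns their attention-weighted sum."""

    def __init__(self, dim):
        super().__init__()
        self.scorer = nn.Sequential(nn.Linear(dim, dim), nn.Tanh(), nn.Linear(dim, 1))

    def forward(self, x):                          # x: (batch, seq_len, dim)
        weights = torch.softmax(self.scorer(x), dim=1)
        return (weights * x).sum(dim=1)            # (batch, dim)


class AdEncoder(nn.Module):
    """Encodes an ad from the word embeddings of its title and its description."""

    def __init__(self, dim):
        super().__init__()
        self.title_pool = AttentionPooling(dim)
        self.desc_pool = AttentionPooling(dim)
        self.combine = nn.Linear(2 * dim, dim)

    def forward(self, title_emb, desc_emb):        # each: (batch, n_words, dim)
        title_vec = self.title_pool(title_emb)
        desc_vec = self.desc_pool(desc_emb)
        return self.combine(torch.cat([title_vec, desc_vec], dim=-1))


class UserEncoder(nn.Module):
    """Multi-view user encoder: one view for search queries, one for browsed
    webpages. Each view is hierarchical: word attention builds a vector per
    behavior, behavior attention builds a view vector, view attention combines."""

    def __init__(self, dim):
        super().__init__()
        self.word_pool = AttentionPooling(dim)     # words within a query / webpage
        self.query_pool = AttentionPooling(dim)    # across search queries
        self.page_pool = AttentionPooling(dim)     # across browsed webpages
        self.view_pool = AttentionPooling(dim)     # across the two views

    def encode_view(self, behaviors, behavior_pool):
        # behaviors: (batch, n_behaviors, n_words, dim)
        b, n, w, d = behaviors.shape
        behavior_vecs = self.word_pool(behaviors.reshape(b * n, w, d)).reshape(b, n, d)
        return behavior_pool(behavior_vecs)        # (batch, dim)

    def forward(self, query_words, page_words):
        query_view = self.encode_view(query_words, self.query_pool)
        page_view = self.encode_view(page_words, self.page_pool)
        return self.view_pool(torch.stack([query_view, page_view], dim=1))


def ctr_score(user_vec, ad_vec):
    """Click probability from the user-ad dot product (an assumed scoring choice)."""
    return torch.sigmoid((user_vec * ad_vec).sum(dim=-1))


if __name__ == "__main__":
    # Hypothetical shapes: 8 impressions, embedding dim 128, 30-word titles,
    # 60-word descriptions, 5 queries and 10 webpages of 6 words each.
    dim = 128
    ad_enc, user_enc = AdEncoder(dim), UserEncoder(dim)
    titles, descs = torch.randn(8, 30, dim), torch.randn(8, 60, dim)
    queries, pages = torch.randn(8, 5, 6, dim), torch.randn(8, 10, 6, dim)
    scores = ctr_score(user_enc(queries, pages), ad_enc(titles, descs))
    print(scores.shape)                            # torch.Size([8])
```

In this sketch the same attention-pooling module is reused at each level of the hierarchy (words within a behavior, behaviors within a view, then the two views), which mirrors the structure described in the abstract; the paper itself may use different attention forms and a different final prediction layer.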


Notes

  1. https://www.msn.com/en-us.


Author information

Correspondence to Mingxiao An.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

An, M., Wu, F., Wang, H., Di, T., Huang, J., Xie, X. (2019). Neural CTR Prediction for Native Ad. In: Sun, M., Huang, X., Ji, H., Liu, Z., Liu, Y. (eds) Chinese Computational Linguistics. CCL 2019. Lecture Notes in Computer Science (LNAI), vol 11856. Springer, Cham. https://doi.org/10.1007/978-3-030-32381-3_48

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-32381-3_48

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-32380-6

  • Online ISBN: 978-3-030-32381-3

  • eBook Packages: Computer Science, Computer Science (R0)
