International Workshop on Deep Learning Practice for High-Dimensional Sparse Data with RecSys 2023
Pages 1276–1280
Published In
September 2023
1406 pages
ISBN: 9798400702419
DOI: 10.1145/3604915
Copyright © 2023 Owner/Author.
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.
Sponsors
- SIGWEB: ACM Special Interest Group on Hypertext, Hypermedia, and Web
- SIGAI: ACM Special Interest Group on Artificial Intelligence
- SIGKDD: ACM Special Interest Group on Knowledge Discovery in Data
- SIGIR: ACM Special Interest Group on Information Retrieval
- SIGCHI: ACM Special Interest Group on Computer-Human Interaction
- SIGecom: ACM Special Interest Group on Economics and Computation
Publisher
Association for Computing Machinery
New York, NY, United States
Publication History
Published: 14 September 2023
Qualifiers
- Extended-abstract
- Research
- Refereed limited
Conference
RecSys '23: Seventeenth ACM Conference on Recommender Systems
September 18–22, 2023
Singapore, Singapore
Acceptance Rates
Overall acceptance rate: 254 of 1,295 submissions (20%)