
AtRec: Accelerating Recommendation Model Training on CPUs


Abstract:

The popularity of recommendation models and the enhanced AI processing capability of CPUs have created massive performance opportunities for delivering satisfactory experiences to a large number of users. Unfortunately, existing recommendation model training methods fail to achieve high efficiency due to unique challenges such as dynamic shapes and high parallelism. To address these limitations, we comprehensively study the distinctive characteristics of recommendation models and discover several unexploited optimization opportunities. To exploit these opportunities, we propose AtRec, a high-performance recommendation model training engine that significantly accelerates the training process on CPUs. Specifically, AtRec takes a comprehensive approach to training that employs joint operator-level and graph-level optimizations together with runtime optimization. At the operator level, AtRec identifies and optimizes the time-consuming operators, which in turn enables efficient graph-level optimizations. At the graph level, AtRec conducts an in-depth analysis of the inefficiencies in several frequently used subgraphs, enabling further performance improvements by eliminating redundant computations and memory accesses. In addition, to achieve better runtime performance, AtRec identifies inefficiencies prevalent in current scheduling and proposes runtime batching. The experimental results demonstrate that AtRec significantly outperforms state-of-the-art recommendation model training engines. We have open-sourced the implementation and corresponding data of AtRec to boost research in this direction.
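The abstract does not detail how AtRec's optimizations work, but the general idea behind batching many small per-sample operations into one large vectorized operation (a common way to amortize per-operator scheduling overhead on CPUs and to cope with dynamic shapes) can be sketched as follows. This is an illustrative example only, not AtRec's actual implementation; all names are hypothetical.

```python
# Illustrative sketch (not AtRec's implementation): replacing many tiny
# per-sample embedding lookups with a single padded, vectorized gather.
import numpy as np

def lookup_per_sample(table, batch_of_ids):
    # One small gather-and-sum per sample: many tiny ops, so per-op
    # scheduling overhead dominates on CPUs.
    return [table[ids].sum(axis=0) for ids in batch_of_ids]

def lookup_batched(table, batch_of_ids):
    # Pad the variable-length id lists to a fixed shape (one common way
    # to handle dynamic shapes), then do one large gather and masked sum.
    max_len = max(len(ids) for ids in batch_of_ids)
    padded = np.zeros((len(batch_of_ids), max_len), dtype=np.int64)
    mask = np.zeros((len(batch_of_ids), max_len, 1), dtype=table.dtype)
    for i, ids in enumerate(batch_of_ids):
        padded[i, :len(ids)] = ids
        mask[i, :len(ids), 0] = 1.0
    gathered = table[padded]            # one big gather: (B, max_len, dim)
    return (gathered * mask).sum(axis=1)

# Tiny embedding table: 10 rows, dimension 2; row i is [2i, 2i+1].
table = np.arange(20, dtype=np.float64).reshape(10, 2)
ids = [[1, 3], [0], [2, 4, 5]]          # variable-length id lists
reference = np.stack(lookup_per_sample(table, ids))
batched = lookup_batched(table, ids)
assert np.allclose(reference, batched)
```

Both paths compute the same per-sample sums; the batched version trades a little padding work for a single large kernel invocation instead of one per sample.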
Published in: IEEE Transactions on Parallel and Distributed Systems (Volume: 35, Issue: 6, June 2024)
Page(s): 905 - 918
Date of Publication: 25 March 2024
