
Real-Time Optimized N-Gram for Mobile Devices


Abstract:

With the increasing number of mobile devices, there has been continuous research on generating optimized Language Models (LMs) for soft keyboards. In spite of advances in this domain, building a single LM that serves low-end feature phones as well as high-end smartphones remains a pressing need. Hence, we propose a novel technique, Optimized N-gram (Op-Ngram), an end-to-end N-gram pipeline that utilizes mobile resources efficiently for faster Word Completion (WC) and Next Word Prediction (NWP). Op-Ngram applies Stupid Backoff [1] and pruning strategies to generate a lightweight model. The LM loading time on mobile is linear with respect to model size. We observed that Op-Ngram gives a 37% improvement in LM-ROM size, 76% in LM-RAM size, 88% in loading time and 89% in average suggestion time compared to the Sorted Array variant of BerkeleyLM [2]. Moreover, our method shows significant performance improvement over KenLM [3] as well.
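The Stupid Backoff scoring the abstract refers to can be illustrated with a minimal sketch. This is not the authors' Op-Ngram implementation; it is a generic rendering of the scheme from Brants et al. [1]: score an n-gram by its relative frequency, and when the n-gram is unseen, back off to the (n-1)-gram score scaled by a fixed factor (0.4 in the original paper). The corpus, counting helper, and function names below are illustrative assumptions.

```python
from collections import defaultdict

ALPHA = 0.4  # fixed backoff factor suggested in the Stupid Backoff paper [1]

def count_ngrams(tokens, max_n=3):
    """Count all 1..max_n grams in a token sequence."""
    counts = defaultdict(int)
    for n in range(1, max_n + 1):
        for i in range(len(tokens) - n + 1):
            counts[tuple(tokens[i:i + n])] += 1
    return counts

def stupid_backoff(word, context, counts, total):
    """Return the Stupid Backoff score of `word` given a context tuple.

    Note: this is a relative-frequency *score*, not a normalized probability,
    which is what makes the scheme cheap enough for on-device use.
    """
    ngram = context + (word,)
    if counts.get(ngram, 0) > 0 and (not context or counts.get(context, 0) > 0):
        denom = counts[context] if context else total
        return counts[ngram] / denom
    if context:
        # Unseen n-gram: back off to a shorter context, scaled by ALPHA.
        return ALPHA * stupid_backoff(word, context[1:], counts, total)
    return counts.get((word,), 0) / total  # unigram base case

# Toy corpus (illustrative only)
tokens = "the cat sat on the mat the cat ran".split()
counts = count_ngrams(tokens)
total = len(tokens)
```

For Next Word Prediction, a keyboard engine would rank candidate words by this score under the current context; for Word Completion, the same ranking is restricted to words sharing the typed prefix.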
Date of Conference: 30 January 2019 - 01 February 2019
Date Added to IEEE Xplore: 14 March 2019
Print on Demand(PoD) ISSN: 2325-6516
Conference Location: Newport Beach, CA, USA

