Next location prediction using transformers
by Salah Eddine Henouda; Fatima Zohra Laallam; Okba Kazar; Abdessamed Sassi
International Journal of Business Intelligence and Data Mining (IJBIDM), Vol. 21, No. 2, 2022

Abstract: This work addresses the next location prediction problem for mobile users. We focus on the RoBERTa architecture (robustly optimised BERT approach) to build a next location prediction model using a subset of a large real mobility trace database, made publicly available through the CRAWDAD project. RoBERTa, a well-known model in natural language processing (NLP), is trained to predict hidden sections of text using a language masking strategy. The current paper follows an architecture similar to RoBERTa and proposes a new combination of the BERT WordPiece tokeniser and RoBERTa for location prediction, which we call WP-BERTA. The results show that WP-BERTA delivers a significant improvement in next location prediction accuracy over state-of-the-art models; in particular, it outperformed Markovian models, support vector machines (SVMs), convolutional neural networks (CNNs), and long short-term memory networks (LSTMs).
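The masking strategy the abstract refers to treats a user's location trace like a sentence: location identifiers are tokenised, a fraction of them are hidden, and the model learns to recover them from context. A minimal sketch of that BERT-style masking step is below; the function name, masking rates, and toy trace are illustrative assumptions, not details taken from the paper.

```python
import random

MASK = "[MASK]"

def mask_locations(tokens, vocab, mask_prob=0.15, seed=0):
    """BERT-style masking over a sequence of location tokens.

    Returns (inputs, labels): labels holds the original token at
    selected positions and None elsewhere, so only the masked
    positions contribute to the training loss.
    """
    rng = random.Random(seed)
    inputs, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)
            r = rng.random()
            if r < 0.8:
                inputs.append(MASK)               # 80%: replace with [MASK]
            elif r < 0.9:
                inputs.append(rng.choice(vocab))  # 10%: random location
            else:
                inputs.append(tok)                # 10%: keep unchanged
        else:
            inputs.append(tok)
            labels.append(None)
    return inputs, labels

# Toy trace: access-point IDs visited by one user (hypothetical data).
vocab = [f"loc_{i}" for i in range(20)]
trace = [vocab[i % 20] for i in range(50)]
inputs, labels = mask_locations(trace, vocab, seed=42)
```

A masked-language model such as RoBERTa would then be trained to predict the original location at each supervised position, which is what makes the same architecture usable for next location prediction.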

Online publication date: Thu, 11-Aug-2022
