Abstract:
The emergence of pre-trained models based on deep learning has considerably enhanced the development of many applications, such as chatbots. These models can be fine-tuned for specific tasks to improve chatbot accuracy. The core of a chatbot is its ability to understand the user’s intent through its Natural Language Understanding (NLU) component. Within NLU, intent classification is a central task. Recently, transformer models have revolutionized this task by capturing the semantic relations between words in a sentence. This article presents a comparative study and critical analysis of four transformer models, namely BERT, ALBERT, RoBERTa, and GPT-2, to identify which offers the best accuracy on an existing dataset for the intent classification task.
Published in: 2023 6th International Conference on Advanced Communication Technologies and Networking (CommNet)
Date of Conference: 11-13 December 2023
Date Added to IEEE Xplore: 22 December 2023
ISBN Information: