
Transformer-Based Models for Named Entity Recognition: A Comparative Study


Abstract:

Named entity recognition (NER) is a critical task in natural language processing (NLP) that involves detecting and classifying named entities in text. In this article, we present a comprehensive study of NER on the CoNLL dataset. We trace the evolution of NER techniques, from traditional rule-based methods to state-of-the-art transformer-based models. Our experiments evaluate three popular models, BERT, ALBERT, and XLM-RoBERTa, on the CoNLL dataset; we analyze their accuracies and discuss their strengths and limitations. The results demonstrate the effectiveness of transformer-based models: BERT achieves an accuracy of 99.50%, ALBERT 96.90%, and XLM-RoBERTa 87.82%. We also discuss the importance of dataset preprocessing, model fine-tuning, and hyperparameter optimization for achieving optimal performance. This research contributes to the understanding of NER techniques and provides insight into the performance of transformer-based models on NER tasks.
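The abstract does not include the authors' code, but the fine-tuning workflow it describes is the standard token-classification recipe. The sketch below shows that recipe for BERT on CoNLL-2003 using the Hugging Face transformers and datasets libraries; the model checkpoint, learning rate, batch size, and epoch count are illustrative assumptions, not values reported in the paper.

```python
# Minimal sketch (not the authors' code) of fine-tuning BERT for NER on
# CoNLL-2003. Hyperparameters below are assumptions for illustration.
from datasets import load_dataset
from transformers import (AutoModelForTokenClassification, AutoTokenizer,
                          DataCollatorForTokenClassification, Trainer,
                          TrainingArguments)

dataset = load_dataset("conll2003")
label_list = dataset["train"].features["ner_tags"].feature.names

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")

def tokenize_and_align(examples):
    # WordPiece splits words into sub-tokens; realign the word-level NER
    # labels, masking special tokens and continuation pieces with -100 so
    # the loss ignores them.
    tokenized = tokenizer(examples["tokens"], truncation=True,
                          is_split_into_words=True)
    all_labels = []
    for i, labels in enumerate(examples["ner_tags"]):
        word_ids = tokenized.word_ids(batch_index=i)
        prev, ids = None, []
        for wid in word_ids:
            ids.append(-100 if wid is None or wid == prev else labels[wid])
            prev = wid
        all_labels.append(ids)
    tokenized["labels"] = all_labels
    return tokenized

encoded = dataset.map(tokenize_and_align, batched=True)

model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-cased", num_labels=len(label_list))

args = TrainingArguments(
    output_dir="ner-bert",
    learning_rate=2e-5,              # assumed; subject to hyperparameter search
    per_device_train_batch_size=16,  # assumed
    num_train_epochs=3,              # assumed
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
    data_collator=DataCollatorForTokenClassification(tokenizer),
)
trainer.train()
```

Swapping the checkpoint string for "albert-base-v2" or "xlm-roberta-base" reproduces the other two model configurations compared in the paper.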
Date of Conference: 06-08 July 2023
Date Added to IEEE Xplore: 23 November 2023
Conference Location: Delhi, India
