
Rating Ease of Readability using Transformers


Abstract:

Understanding and rating text complexity accurately can have a considerable impact on learning and education. In the past few decades, educators have used traditional readability formulas to match texts with the readability level of students, but these formulas tend to oversimplify the different dimensions of text difficulty. Recently, transformer-based language models have brought the field of Natural Language Processing into a new era by understanding text more effectively and achieving great success on many tasks. In this study, we assess the effectiveness of different pre-trained transformers at rating ease of readability. We propose a model built on top of the pre-trained RoBERTa transformer with weighted pooling, which uses information from multiple hidden states to perform this task more accurately. Our experiments are conducted on a dataset of English excerpts annotated by language experts, obtained from Kaggle. On this dataset, our proposed model achieves a 71% improvement over the traditional Flesch formula and a significant boost over other transformer models and Long Short-Term Memory (LSTM) networks.
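
The sketch below illustrates the kind of architecture the abstract describes: a regression head on top of a pre-trained RoBERTa encoder that pools all of the encoder's hidden layers with learned weights before predicting a readability score. It is a minimal illustration, not the authors' released code; the base checkpoint (roberta-base), mean pooling over tokens, and the single linear regressor are assumptions made for the example.

import torch
import torch.nn as nn
from transformers import RobertaModel, RobertaTokenizer


class WeightedPoolingReadabilityModel(nn.Module):
    """RoBERTa encoder + learned weighted pooling over hidden layers (illustrative sketch)."""

    def __init__(self, model_name: str = "roberta-base"):
        super().__init__()
        # output_hidden_states=True exposes every encoder layer, not just the last one.
        self.encoder = RobertaModel.from_pretrained(model_name, output_hidden_states=True)
        num_layers = self.encoder.config.num_hidden_layers + 1  # embedding layer + 12 blocks
        # One learnable scalar weight per hidden layer, softmax-normalised at use time.
        self.layer_weights = nn.Parameter(torch.zeros(num_layers))
        self.regressor = nn.Linear(self.encoder.config.hidden_size, 1)

    def forward(self, input_ids, attention_mask):
        outputs = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        # hidden_states: tuple of (batch, seq_len, hidden) tensors, one per layer.
        hidden_states = torch.stack(outputs.hidden_states, dim=0)
        weights = torch.softmax(self.layer_weights, dim=0).view(-1, 1, 1, 1)
        mixed = (weights * hidden_states).sum(dim=0)          # weighted mix of layers
        # Mean-pool over tokens, ignoring padding positions.
        mask = attention_mask.unsqueeze(-1).float()
        sentence_repr = (mixed * mask).sum(dim=1) / mask.sum(dim=1)
        return self.regressor(sentence_repr).squeeze(-1)      # one readability score per excerpt


if __name__ == "__main__":
    tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
    model = WeightedPoolingReadabilityModel()
    batch = tokenizer(["The cat sat on the mat."], return_tensors="pt",
                      padding=True, truncation=True)
    print(model(batch["input_ids"], batch["attention_mask"]))

Trained with a standard regression loss (e.g. mean squared error) against expert readability ratings, a head of this form lets the model draw on lower layers, which tend to carry surface and syntactic cues, as well as the top layer, rather than relying on the final hidden state alone.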
Date of Conference: 25-27 March 2022
Date Added to IEEE Xplore: 26 April 2022
Conference Location: Brisbane, Australia
