
Improving Log-Based Anomaly Detection by Pre-Training Hierarchical Transformers



Abstract:

Pre-trained models, such as BERT, have resulted in significant improvements in many natural language processing (NLP) applications. However, due to differences in word distribution and domain data distribution, directly applying NLP advancements to log analysis faces performance challenges. This paper studies how to adapt the recently introduced pre-trained language model BERT for log analysis. In this work, we propose a pre-trained log representation model with hierarchical bidirectional encoder transformers (namely, HilBERT). Unlike previous work, which used raw text as pre-training data, we parse logs into templates and then use the log templates to pre-train HilBERT. We also design a hierarchical transformer model to capture log template sequence-level information. We use log-based anomaly detection as the downstream task and fine-tune our model with different log data. Our experiments demonstrate that HilBERT outperforms other baseline techniques on unstable log data. While BERT obtains performance comparable to that of previous state-of-the-art models, HilBERT significantly mitigates the problem of log instability and achieves accurate and robust results.
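
For illustration only, below is a minimal sketch (not the authors' released model) of the hierarchical encoding idea described in the abstract: a lower-level transformer encodes the tokens of each parsed log template into a single vector, and an upper-level transformer encodes the resulting sequence of template vectors before an anomaly classification head. All module names, dimensions, and the mean-pooling choice are illustrative assumptions, written with PyTorch.

# Hypothetical sketch of a hierarchical transformer over parsed log templates.
# Lower level: encode the tokens of each template. Upper level: encode the
# sequence of template embeddings. Head: binary anomaly classification.
import torch
import torch.nn as nn

class HierarchicalLogEncoder(nn.Module):
    def __init__(self, vocab_size=5000, d_model=128, nhead=4,
                 num_token_layers=2, num_template_layers=2, num_classes=2):
        super().__init__()
        self.token_embedding = nn.Embedding(vocab_size, d_model, padding_idx=0)
        token_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.token_encoder = nn.TransformerEncoder(token_layer, num_token_layers)
        template_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.template_encoder = nn.TransformerEncoder(template_layer, num_template_layers)
        self.classifier = nn.Linear(d_model, num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, num_templates, tokens_per_template)
        b, t, s = token_ids.shape
        x = self.token_embedding(token_ids.view(b * t, s))   # embed template tokens
        x = self.token_encoder(x)                             # token-level encoder
        template_emb = x.mean(dim=1).view(b, t, -1)           # pool each template to one vector
        seq = self.template_encoder(template_emb)             # template-sequence-level encoder
        return self.classifier(seq.mean(dim=1))               # anomaly logits per log sequence

# Toy usage: a batch of 2 log sequences, each with 8 templates of 16 token ids.
model = HierarchicalLogEncoder()
logits = model(torch.randint(1, 5000, (2, 8, 16)))
print(logits.shape)  # torch.Size([2, 2])

In practice, the lower-level encoder would be pre-trained on parsed log templates (as the abstract describes for HilBERT) and the whole model fine-tuned on labeled anomaly-detection data; the sketch above only conveys the two-level structure.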
Published in: IEEE Transactions on Computers (Volume: 72, Issue: 9, 01 September 2023)
Page(s): 2656 - 2667
Date of Publication: 15 March 2023
