Abstract
Artificial Intelligence (AI) is a key driver of the Industry 4.0 revolution. In industrial automation systems, data points of assets are represented by globally unique identifiers known as “Tags,” which often contain abbreviated asset and attribute information. These abbreviations must be translated into concrete names to map data points to their corresponding assets. In this paper, we introduce DELA (Dual Embedding using LSTM and Attention), a deep learning approach that uses two neural networks to classify the “Attribute” and “Asset” of a tag for tag-to-asset mapping. The models are trained on real-world, industry-standard datasets from the automation industry. To evaluate the generalization of our models, our experiments include a test set containing numerous abbreviations not present in the training set, ensuring that DELA can handle data with uncommon naming conventions. Our extensive experiments show that DELA efficiently achieves performance surpassing that of current state-of-the-art approaches.
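The full DELA architecture is not reproduced on this page, but the abstract's description (embedded tag tokens fed to a BiLSTM with attention, with one classifier for “Asset” and one for “Attribute”) can be illustrated with a minimal Keras sketch of one of the two networks. Everything concrete below, including the character-level input, layer sizes, vocabulary and class counts, and the additive attention pooling, is an assumption for illustration rather than the authors' exact model.

from tensorflow.keras import layers, Model

MAX_LEN = 32        # assumed maximum tag length in characters
VOCAB_SIZE = 64     # assumed size of the character vocabulary
NUM_CLASSES = 20    # assumed number of "Asset" (or "Attribute") classes

# One network of the dual setup: a character-level BiLSTM encoder with
# additive attention pooling, followed by a softmax classifier.
inputs = layers.Input(shape=(MAX_LEN,), dtype="int32", name="tag_chars")
x = layers.Embedding(VOCAB_SIZE, 64)(inputs)                         # learned character embeddings
x = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(x)  # contextual states, (batch, MAX_LEN, 128)

scores = layers.Dense(1)(x)                  # unnormalised attention score per position
weights = layers.Softmax(axis=1)(scores)     # attention weights over the tag's positions
context = layers.Dot(axes=1)([weights, x])   # weighted sum of BiLSTM states, (batch, 1, 128)
context = layers.Flatten()(context)          # pooled representation, (batch, 128)

outputs = layers.Dense(NUM_CLASSES, activation="softmax")(context)

model = Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()

Under this reading, the second network would share the same structure but predict the attribute label, and the two predictions together yield the tag-to-asset mapping described in the abstract.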
Copyright information
© 2025 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Zhao, Z., Erickson, B.K., Chakraborty, S., Liu, W. (2025). DELA: Dual Embedding Using LSTM and Attention for Asset Tag Inference in Industrial Automation Systems. In: Gong, M., Song, Y., Koh, Y.S., Xiang, W., Wang, D. (eds) AI 2024: Advances in Artificial Intelligence. Lecture Notes in Computer Science, vol. 15442. Springer, Singapore. https://doi.org/10.1007/978-981-96-0348-0_1
DOI: https://doi.org/10.1007/978-981-96-0348-0_1
Publisher Name: Springer, Singapore
Print ISBN: 978-981-96-0347-3
Online ISBN: 978-981-96-0348-0
eBook Packages: Computer Science (R0)