Abstract
The State Grid Enterprise Public Data Model (SG-CIM 4.0) is a semantically unified data model that supports smart grid business applications. Consistency checking of the model relies on a standard table whose entities are matched against similar entities in the physical model. Both the standard-table entities and the physical-model entities contain continuous and discrete attributes, so two problems must be solved: how to accurately compute the similarity of these heterogeneous attributes, and how to fuse the individual similarities into a single score that can efficiently and accurately identify the entity pairs with the highest similarity. To address these problems and keep the similarity calculation and fusion method scalable, this paper computes the syntactic and semantic similarity of continuous attributes and the similarity of discrete attributes over the attribute contents of each entity, and introduces a method, called SFNAS (Similarity Fusion NAS), that uses Neural Architecture Search (NAS) to automatically design a Similarity Fusion Neural Network that fuses these similarities. The fused similarity produced by SFNAS outperforms the traditional linear weighted average similarity in terms of entity-pair matching hit rate. This paper can provide a useful reference for subsequent research on SG-CIM models.
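To make the pipeline described in the abstract concrete, the following is a minimal illustrative sketch, not the authors' implementation: it computes three per-attribute similarities for a hypothetical entity pair (normalized edit distance as syntactic similarity, a placeholder value standing in for an embedding-based semantic similarity, and the Jaccard coefficient for discrete attributes), then contrasts the baseline linear weighted average with a small fusion network whose architecture and weights are placeholders for what NAS would design and train.

```python
# Illustrative sketch only. All entity names, attribute values, and network
# weights below are assumptions for demonstration, not values from the paper.

import numpy as np


def edit_similarity(a: str, b: str) -> float:
    """Normalized Levenshtein similarity as a stand-in for syntactic similarity."""
    m, n = len(a), len(b)
    if max(m, n) == 0:
        return 1.0
    dp = list(range(n + 1))
    for i in range(1, m + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cur = dp[j]
            dp[j] = min(dp[j] + 1, dp[j - 1] + 1, prev + (a[i - 1] != b[j - 1]))
            prev = cur
    return 1.0 - dp[n] / max(m, n)


def jaccard_similarity(a: set, b: set) -> float:
    """Jaccard coefficient for discrete (categorical) attribute values."""
    return len(a & b) / len(a | b) if a | b else 1.0


def linear_fusion(sims: np.ndarray, weights: np.ndarray) -> float:
    """Baseline: linear weighted average of the individual similarities."""
    return float(np.dot(sims, weights) / weights.sum())


def mlp_fusion(sims, w1, b1, w2, b2) -> float:
    """One-hidden-layer fusion network; in the paper this architecture would be
    found by NAS and trained, here shapes and weights are placeholders."""
    hidden = np.tanh(sims @ w1 + b1)
    return float(1.0 / (1.0 + np.exp(-(hidden @ w2 + b2))))  # sigmoid score


if __name__ == "__main__":
    # Hypothetical pair: a standard-table entity attribute vs. a physical-model one.
    syntactic = edit_similarity("transformer_id", "transformer_code")
    semantic = 0.83  # placeholder for a cosine similarity of sentence embeddings
    discrete = jaccard_similarity({"10kV", "35kV"}, {"10kV", "110kV"})

    sims = np.array([syntactic, semantic, discrete])
    print("individual similarities:", sims.round(3))
    print("linear weighted average:", round(linear_fusion(sims, np.ones(3)), 3))

    rng = np.random.default_rng(0)  # untrained random weights, shapes only
    print("fusion network output:", round(
        mlp_fusion(sims, rng.normal(size=(3, 4)), rng.normal(size=4),
                   rng.normal(size=4), 0.0), 3))
```

In the paper's setting, the fusion network's structure is not fixed by hand as above but searched automatically, and its output is used to rank candidate entity pairs by fused similarity.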
Acknowledgment
This work was financially supported by the Science and Technology Program of State Grid Corporation of China under Grant No. 5211DS21000T.
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Liao, X. et al. (2022). An Automatic Design Method of Similarity Fusion Neural Network Based on SG-CIM Model. In: Qiu, M., Gai, K., Qiu, H. (eds) Smart Computing and Communication. SmartCom 2021. Lecture Notes in Computer Science, vol 13202. Springer, Cham. https://doi.org/10.1007/978-3-030-97774-0_32
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-97773-3
Online ISBN: 978-3-030-97774-0