
Research on Intelligent Robot Engine of Electric Power Online Customer Services Based on Knowledge Graph

Published: 4 June 2020 in ICIAI '20: Proceedings of the 2020 4th International Conference on Innovation in Artificial Intelligence (DOI: 10.1145/3390557.3394312)

ABSTRACT

Power grid enterprises have established various electronic service channels to serve electric power customers better, and customers' online requests are currently handled by customer service personnel. This paper proposes the design of an intelligent robot for online customer services that adopts a variety of mature artificial intelligence technologies to give customers a more convenient and better service experience. The functional design of the intelligent robot engine is described first, covering business guidance rule management, engine support, knowledge graph support, service channel support, and operation management support; the engine can serve as an essential component of the collaborative channel management platform in the online channel management system. The working mechanism of the engine is then elaborated: the workflow for providing automatic online customer service is illustrated, and the corresponding business guidance rules are described, taking those for electricity quantity inquiry as an example. An electric power knowledge graph is established to locate the keywords of electric power business consultations in customers' requests, where a deep bidirectional LSTM-CRF model with word embeddings is adopted. The advantages of the intelligent robot engine are analyzed from the perspectives of the customers, the service personnel, and the power grid enterprise, and the key points for developing and improving the engine in the future are finally discussed.
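To make the keyword-location step concrete, the sketch below shows a minimal bidirectional LSTM-CRF tagger in PyTorch. It is illustrative only, not the authors' implementation: the class name, tag set, dimensions, and placeholder word indices are assumptions, and CRF training via the sequence negative log-likelihood is omitted; only the BiLSTM emission scores, a learned tag-transition matrix, and Viterbi decoding are shown.

```python
# Minimal sketch (assumed, not the paper's code) of a BiLSTM-CRF tagger:
# word embeddings -> BiLSTM -> per-token emission scores, combined with a
# learned tag-transition matrix and decoded with Viterbi to mark business
# keywords (e.g. BIO-style tags for an electricity quantity inquiry).
import torch
import torch.nn as nn


class BiLSTMCRFTagger(nn.Module):
    def __init__(self, vocab_size, num_tags, embed_dim=100, hidden_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)      # word2vec-style embeddings
        self.bilstm = nn.LSTM(embed_dim, hidden_dim // 2,
                              bidirectional=True, batch_first=True)
        self.emission = nn.Linear(hidden_dim, num_tags)            # per-token tag scores
        self.transitions = nn.Parameter(torch.randn(num_tags, num_tags))  # CRF transition scores

    def forward(self, token_ids):
        # token_ids: (seq_len,) LongTensor for a single tokenized request
        embeds = self.embedding(token_ids).unsqueeze(0)            # (1, seq_len, embed_dim)
        lstm_out, _ = self.bilstm(embeds)                          # (1, seq_len, hidden_dim)
        return self.emission(lstm_out).squeeze(0)                  # (seq_len, num_tags)

    def viterbi_decode(self, emissions):
        # Standard Viterbi over emission + transition scores; returns best tag path.
        seq_len, num_tags = emissions.shape
        score = emissions[0]
        backpointers = []
        for t in range(1, seq_len):
            total = score.unsqueeze(1) + self.transitions + emissions[t].unsqueeze(0)
            score, best_prev = total.max(dim=0)                    # best previous tag per current tag
            backpointers.append(best_prev)
        path = [int(score.argmax())]
        for bp in reversed(backpointers):
            path.append(int(bp[path[-1]]))
        return list(reversed(path))


# Hypothetical usage: tag a tokenized customer request such as
# "查询 本月 电量" ("check this month's electricity quantity").
tags = ["O", "B-QUERY", "B-TIME", "B-QUANTITY"]                    # illustrative tag set
model = BiLSTMCRFTagger(vocab_size=5000, num_tags=len(tags))
token_ids = torch.tensor([17, 42, 356])                            # placeholder word indices
with torch.no_grad():
    emissions = model(token_ids)
    print([tags[i] for i in model.viterbi_decode(emissions)])
```

In the setup described in the abstract, the decoded keyword spans would then be matched against the electric power knowledge graph to identify the consultation topic and trigger the corresponding business guidance rules.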

