Accelerating the Support of Conversational Interfaces for RPAs Through APIs

  • Conference paper
Business Process Management: Blockchain, Robotic Process Automation and Educators Forum (BPM 2023)

Part of the book series: Lecture Notes in Business Information Processing ((LNBIP,volume 491))

Abstract

In the business automation world, APIs are everywhere. They provide access to enterprise tools such as customer relationship management solutions, and to custom automations such as unattended RPA bots that automate repetitive tasks. Unfortunately, APIs are often inaccessible to the business users who need them but lack the technical skills to leverage them. Recently, chatbots have become the go-to medium for making automation software accessible to business users. Since API specifications are not written with chatbot use in mind, additional work is needed to make APIs accessible through a natural language interface. Making this process scale to many APIs requires an automated training-data pipeline for intent recognition models, a crucial component within chatbots for understanding natural language utterances from users. More accurate intent recognition models lead to better user experience and satisfaction. Prior work proposed approaches to extracting intents from OpenAPI specifications; however, the resulting models tend to be brittle due to weaknesses in the training data. In this work, we propose a data augmentation approach based on paraphrasing with large language models, and we propose a system that generates sentences to train intent recognition models. Experimental results highlight the effectiveness of our approach. Our system is deployed in a real-world setting.
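The abstract describes a pipeline that paraphrases seed utterances and filters the results before training an intent recognition model. As a rough illustration of that loop (not the paper's implementation), the sketch below stubs out the LLM-based paraphraser with canned outputs and uses a simple token-overlap filter to drop near-duplicate paraphrases; all names and thresholds here are hypothetical.

```python
def paraphrase(utterance: str) -> list[str]:
    """Stub standing in for an LLM-based paraphraser (e.g. a T5 model).

    Returns hand-written variants for illustration only.
    """
    canned = {
        "create a new ticket": [
            "open a new ticket",
            "create a new ticket",          # exact duplicate of the seed
            "please create a ticket for me",
        ],
    }
    return canned.get(utterance, [])


def token_jaccard(a: str, b: str) -> float:
    """Token-set Jaccard similarity: a cheap near-duplicate detector."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb)


def augment(seed: str, max_sim: float = 0.99) -> list[str]:
    """Keep the seed plus paraphrases that are not near-duplicates
    of anything already kept."""
    kept = [seed]
    for cand in paraphrase(seed):
        if all(token_jaccard(cand, k) < max_sim for k in kept):
            kept.append(cand)
    return kept


augmented = augment("create a new ticket")
# augmented holds the seed plus the two sufficiently distinct paraphrases
```

The kept utterances would then be labeled with the intent of the seed (e.g. "CreateTicket") and fed to the intent classifier's training set; in a real pipeline the stubbed paraphraser and the Jaccard filter would be replaced by an LLM and a learned similarity metric.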

Notes

  1. The module can be flexibly expanded with additional models as needed.

  2. https://quoradata.quora.com/First-Quora-Dataset-Release-Question-Pairs.

  3. Experiments with other similarity metrics such as USE, ROUGE, and BLEU yielded similar results.

  4. http://johncreid.com/wp-content/uploads/2014/12/The-Era-of-Cognitive-Systems-An-Inside-Look-at-IBM-Watson-and-How-it-Works_.pdf.

  5. Parrot [39] is a paraphrase-based phrase augmentation framework specialized for training natural language understanding models. It is built on the HuggingFace T5 library and is among the top three most-downloaded paraphrase generation models on the Hugging Face model hub.
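Note 3 mentions filtering paraphrases with similarity metrics such as USE, ROUGE, and BLEU. As an illustration of how such a filter can score a candidate paraphrase against a seed utterance, here is a minimal ROUGE-1-style unigram F1; this is a simplification of the full metric, not the paper's scorer:

```python
from collections import Counter


def rouge1_f(candidate: str, reference: str) -> float:
    """Unigram-overlap F1 between two sentences (ROUGE-1 style).

    Token counts are clipped via multiset intersection, as in the
    standard ROUGE-1 definition over whitespace tokens.
    """
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)


# A paraphrase sharing 3 of 4 tokens with the seed scores 0.75:
score = rouge1_f("open a new ticket", "create a new ticket")
```

A pipeline can reject candidates whose score is too high (near-duplicates add little diversity) or too low (likely meaning drift); the exact thresholds are design choices, not values from the paper.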

References

  1. Laurent, P., Chollet, T., Herzberg, E.: Intelligent automation entering the business world. Deloitte (2015). https://www2.deloitte.com/content/dam/Deloitte/lu/Documents/operations/lu-intelligent-automationbusiness-world.pdf. Accessed 5 Mar 2018

  2. da Silva Costa, D.A., São Mamede, H., da Silva, M.M.: Robotic process automation (RPA) adoption: a systematic literature review. Eng. Manage. Prod. Serv. 14(2), 1–12 (2022)

  3. Průcha, P., Skrbek, J.: API as method for improving robotic process automation. In: Marrella, A., et al. (eds.) Business Process Management: Blockchain, Robotic Process Automation, and Central and Eastern Europe Forum. BPM 2022. Lecture Notes in Business Information Processing, vol. 459, pp. 260–273. Springer, Cham (2022). https://doi.org/10.1007/978-3-031-16168-1_17

  4. Rizk, Y., et al.: A conversational digital assistant for intelligent process automation. In: Business Process Management: Blockchain and Robotic Process Automation Forum, BPM 2020, Seville, Spain, September 13–18, 2020, pp. 85–100. Springer (2020). https://doi.org/10.1007/978-3-030-58779-6_6

  5. Wang, X., et al.: Interactive data analysis with next-step natural language query recommendation. arXiv:2201.04868 (2022)

  6. Behera, R.K., Bala, P.K., Ray, A.: Cognitive chatbot for personalised contextual customer service: behind the scene and beyond the hype. Inf. Syst. Front. 1–21 (2021). https://doi.org/10.1007/s10796-021-10168-y

  7. Kenton, J.D., Ming-Wei, C., Toutanova, L.K.: BERT: pre-training of deep bidirectional transformers for language understanding. In: Proceedings of NAACL-HLT, pp. 4171–4186 (2019)

  8. Brown, T., et al.: Language models are few-shot learners. Adv. Neural Inf. Process. Syst. 33, 1877–1901 (2020)

  9. Vaziri, M., Mandel, L., Shinnar, A., Siméon, J., Hirzel, M.: Generating chat bots from web API specifications. In: Proceedings of the 2017 ACM SIGPLAN International Symposium on New Ideas, New Paradigms, and Reflections on Programming and Software, pp. 44–57 (2017)

  10. Babkin, P., Chowdhury, M.F.M., Gliozzo, A., Hirzel, M., Shinnar, A.: Bootstrapping chatbots for novel domains. In: Workshop at NIPS on Learning with Limited Labeled Data (2017)

  11. Feng, S.Y., et al.: A survey of data augmentation approaches for NLP. arXiv:2105.03075 (2021)

  12. Hochreiter, S., Schmidhuber, J.: Long short-term memory. Neural Comput. 9, 1735–1780 (1997)

  13. Vaswani, A., et al.: Attention is all you need. arXiv:1706.03762 (2017)

  14. Peters, M.E., et al.: Deep contextualized word representations. In: NAACL (2018)

  15. Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. arXiv:1810.04805 (2019)

  16. Brown, T.B., Mann, B., Ryder, N.: Language models are few-shot learners. arXiv:2005.14165 (2020)

  17. Raffel, C., et al.: Exploring the limits of transfer learning with a unified text-to-text transformer. arXiv:1910.10683 (2020)

  18. Gao, S., Zhang, Y., Ou, Z., Yu, Z.: Paraphrase augmented task-oriented dialog generation. In: ACL (2020)

  19. Kumar, A., Bhattamishra, S., Bhandari, M., Talukdar, P.P.: Submodular optimization-based diverse paraphrasing and its effectiveness in data augmentation. In: NAACL (2019)

  20. Simsek, U., Fensel, D.A.: Intent generation for goal-oriented dialogue systems based on schema.org annotations. arXiv:1807.01292 (2018)

  21. Banarescu, L., et al.: Abstract meaning representation for sembanking. In: 7th Linguistic Annotation Workshop and Interoperability with Discourse (2013)

  22. Jurafsky, D., Martin, J.H.: Dependency parsing. In: Speech and Language Processing, 3rd edn. (draft), chapter 14. https://web.stanford.edu/~jurafsky/slp3/14.pdf (2017)

  23. Zhang, Y., Baldridge, J., He, L.: PAWS: paraphrase adversaries from word scrambling. arXiv:1904.01130 (2019)

  24. Wieting, J., Gimpel, K.: ParaNMT-50M: pushing the limits of paraphrastic sentence embeddings with millions of machine translations. In: ACL (2018)

  25. Sennrich, R.: Improving neural machine translation models with monolingual data. arXiv:1511.06709 (2016)

  26. Bevilacqua, M.: One SPRING to rule them both: symmetric AMR semantic parsing and generation without a complex pipeline. In: AAAI (2021)

  27. Fan, A., Lewis, M., Dauphin, Y.: Hierarchical neural story generation. In: ACL (2018)

  28. Lewis, M.: BART: denoising sequence-to-sequence pre-training for natural language generation, translation, and comprehension. arXiv:1910.13461 (2020)

  29. Gu, J.C., Tao, C., Ling, Z., Xu, C., Geng, X., Jiang, D.: MPC-BERT: a pre-trained language model for multi-party conversation understanding. In: ACL/IJCNLP (2021)

  30. Yang, Y., et al.: Generative data augmentation for commonsense reasoning. arXiv: Computation and Language (2020)

  31. Bocklisch, T., Faulkner, J., Pawlowski, N., Nichol, A.: Rasa: open source language understanding and dialogue management. arXiv:1712.05181 (2017)

  32. Qi, H.: Benchmarking commercial intent detection services with practice-driven evaluations. In: NAACL (2021)

  33. Liu, X.: Benchmarking natural language understanding services for building conversational agents. In: IWSDS (2019)

  34. Casanueva, I., Temčinas, T., Gerz, D., Henderson, M., Vulić, I.: Efficient intent detection with dual sentence encoders. arXiv:2003.04807 (2020)

  35. Larson, S., et al.: An evaluation dataset for intent classification and out-of-scope prediction. arXiv:1909.02027 (2019)

  36. Cer, D.M., et al.: Universal sentence encoder. arXiv:1803.11175 (2018)

  37. Campagna, G., Foryciarz, A., Moradshahi, M., Lam, M.S.: Zero-shot transfer learning with synthesized data for multi-domain dialogue state tracking. In: ACL (2020)

  38. Zhong, V., Lewis, M., Wang, S.I., Zettlemoyer, L.: Grounded adaptation for zero-shot executable semantic parsing. In: EMNLP (2020)

  39. Damodaran, P.: Parrot: paraphrase generation for NLU (2021). https://github.com/PrithivirajDamodaran/Parrot_Paraphraser

Author information

Correspondence to Siyu Huo.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Huo, S., Mukherjee, K., Bandlamudi, J., Isahagian, V., Muthusamy, V., Rizk, Y. (2023). Accelerating the Support of Conversational Interfaces for RPAs Through APIs. In: Köpke, J., et al. Business Process Management: Blockchain, Robotic Process Automation and Educators Forum. BPM 2023. Lecture Notes in Business Information Processing, vol 491. Springer, Cham. https://doi.org/10.1007/978-3-031-43433-4_11

Download citation

  • DOI: https://doi.org/10.1007/978-3-031-43433-4_11

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-43432-7

  • Online ISBN: 978-3-031-43433-4

  • eBook Packages: Computer Science, Computer Science (R0)
