Abstract
This paper applies AI technology to the challenge of medicine compliance checking. Specifically, we propose a Logic-BERT model that estimates, from a patient's electronic medical record, whether a given medicine can be used in that patient's specific situation. We design a sentence-level architecture that distills the text content by segmentation, selection, and recombination, overcoming the input-length limitation of bidirectional encoder representations from transformers (BERT). We also apply data augmentation that integrates logic rules to further improve the model's performance. Experiments on real data verify the effectiveness of our model.
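The segmentation–selection–recombination pipeline mentioned above can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, the keyword-overlap scoring heuristic, and the whitespace token count are all illustrative assumptions standing in for the paper's learned selection mechanism.

```python
# Hypothetical sketch of segment-select-recombine for fitting long clinical
# text into BERT's 512-token input limit. Scoring and tokenization here are
# simplified placeholders, not the paper's actual method.
import re

MAX_TOKENS = 512  # BERT's input length limit (wordpiece tokens in practice)

def segment(text):
    """Split a record into sentences (naive punctuation-based split)."""
    return [s.strip() for s in re.split(r"[.;]\s*", text) if s.strip()]

def select(sentences, keywords):
    """Rank sentences by keyword overlap (stand-in for a learned scorer)."""
    def score(s):
        return sum(1 for k in keywords if k.lower() in s.lower())
    return sorted(sentences, key=score, reverse=True)

def recombine(ranked, budget=MAX_TOKENS):
    """Greedily pack the highest-ranked sentences within the token budget."""
    chosen, used = [], 0
    for s in ranked:
        n = len(s.split())  # crude whitespace token count
        if used + n <= budget:
            chosen.append(s)
            used += n
    return ". ".join(chosen)

record = ("Patient has type 2 diabetes. Prescribed metformin 500mg. "
          "Family history of hypertension. Allergic to penicillin.")
distilled = recombine(select(segment(record), ["metformin", "diabetes"]))
```

In this toy example every sentence fits within the budget; on a real electronic medical record the budget would force the lowest-scoring sentences to be dropped before the distilled text is passed to BERT.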
© 2021 Springer Nature Switzerland AG
Cite this paper
Jia, G., Zhu, W., Tang, J., Zhang, W. (2021). Leveraging Artificial Intelligence in Medicine Compliance Check. In: Nah, F.FH., Siau, K. (eds) HCI in Business, Government and Organizations. HCII 2021. Lecture Notes in Computer Science(), vol 12783. Springer, Cham. https://doi.org/10.1007/978-3-030-77750-0_37
Print ISBN: 978-3-030-77749-4
Online ISBN: 978-3-030-77750-0