Abstract
General large language models (LLMs) such as ChatGPT have shown remarkable success. However, such LLMs have not been widely adopted for medical purposes, due to their limited accuracy and inability to provide reliable medical advice. We propose IvyGPT, an LLM based on LLaMA that is trained and fine-tuned on high-quality medical question-answer (QA) instances with Reinforcement Learning from Human Feedback (RLHF). During training, we used QLoRA to fine-tune 33 billion parameters on a small number of NVIDIA A100 (80 GB) GPUs. Experimental results show that IvyGPT outperforms other medical GPT models. The online demo is available at http://81.71.71.157:52022. Our demo video can be found at https://youtu.be/O4D74pQh8Is.
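The abstract attributes IvyGPT's memory efficiency to QLoRA, which freezes a quantized base model and trains only low-rank adapter matrices. The sketch below is an illustrative NumPy toy of the underlying LoRA update, not the authors' implementation; the dimensions, rank, and scaling factor are assumed for demonstration, and the 4-bit quantization of the base weight is omitted.

```python
import numpy as np

# Toy sketch of the low-rank adaptation (LoRA) idea behind QLoRA:
# the pretrained weight W is frozen (in QLoRA it is also stored 4-bit
# quantized), and only two small matrices A and B are trained. The
# effective weight is W + (alpha / r) * B @ A, so trainable parameters
# shrink from d*k to r*(d + k).

d, k, r = 512, 512, 8   # output dim, input dim, LoRA rank (assumed values)
alpha = 16.0            # LoRA scaling hyperparameter (assumed value)

rng = np.random.default_rng(0)
W = rng.standard_normal((d, k))          # frozen pretrained weight
A = rng.standard_normal((r, k)) * 0.01   # trainable down-projection
B = np.zeros((d, r))                     # trainable up-projection, zero-init

def lora_forward(x):
    """Forward pass adding the low-rank delta to the frozen base weight."""
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(k)
# With B initialized to zero, the adapted model starts identical to the base.
assert np.allclose(lora_forward(x), W @ x)

full_params = d * k
lora_params = r * (d + k)
print(f"trainable params: {lora_params} vs full fine-tuning: {full_params}")
```

Because only `A` and `B` receive gradients, the optimizer state scales with `r*(d + k)` rather than `d*k`, which is what makes fine-tuning a 33B-parameter model feasible on a handful of 80 GB GPUs.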
References
Dettmers, T., Pagnoni, A., Holtzman, A., Zettlemoyer, L.: QLoRA: efficient finetuning of quantized LLMs. arXiv preprint arXiv:2305.14314 (2023)
Liu, H., Liao, Y., Meng, Y., Wang, Y., Wang, Y.: MedicalGPT-zh (2023). https://github.com/MediaBrain-SJTU/MedicalGPT-zh
Hu, E.J., et al.: LoRA: low-rank adaptation of large language models. arXiv preprint arXiv:2106.09685 (2021)
Li, J., Pang, P.C.I., Xiao, Y., Wong, D.: Changes in doctor-patient relationships in China during COVID-19: a text mining analysis. Int. J. Environ. Res. Public Health 19(20), 13446 (2022)
Ouyang, L., et al.: Training language models to follow instructions with human feedback (2022)
Touvron, H., et al.: LLaMA: open and efficient foundation language models (2023)
Wang, H., et al.: HuaTuo: tuning LLaMA model with Chinese medical knowledge (2023)
Zhang, H., et al.: HuatuoGPT, towards taming language model to be a doctor (2023)
Zhu, W., Wang, X.: ChatMed: a Chinese medical large language model (2023). https://github.com/michael-wzhu/ChatMed
Zhu, W., Wang, X.: ShenNong-TCM-LLM (2023). https://github.com/michael-wzhu/ShenNong-TCM-LLM
Funding
This work was funded by the Science and Technology Development Fund of Macau SAR (Grant Number 0105/2022/A and File Number 0041/2023/RIB2).
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this paper
Wang, R. et al. (2024). IvyGPT: InteractiVe Chinese Pathway Language Model in Medical Domain. In: Fang, L., Pei, J., Zhai, G., Wang, R. (eds) Artificial Intelligence. CICAI 2023. Lecture Notes in Computer Science(), vol 14474. Springer, Singapore. https://doi.org/10.1007/978-981-99-9119-8_34
DOI: https://doi.org/10.1007/978-981-99-9119-8_34
Publisher Name: Springer, Singapore
Print ISBN: 978-981-99-9118-1
Online ISBN: 978-981-99-9119-8
eBook Packages: Computer Science (R0)