Information Extraction from Clinical Notes: Are We Ready to Switch to Large Language Models?

Overview

This repository contains the code and resources for the Kiwi BERT model, designed to extract medical problem, treatment, test, and drug entities, along with their modifiers, from clinical notes.

Environment

```bash
git clone https://github.com/BIDS-Xu-Lab/Kiwi-BERT.git
cd Kiwi-BERT
pip install -r requirements.txt
```
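A BERT-based extractor of this kind typically emits token-level BIO tags that are then grouped into entity spans. The sketch below illustrates that decoding step only; the label names (`problem`, `drug`) and the `decode_bio` helper are illustrative assumptions, not the repository's actual API.

```python
def decode_bio(tokens, tags):
    """Group BIO-tagged tokens into (entity_text, entity_type) spans."""
    entities = []
    current_tokens, current_type = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            # A new entity begins; flush any entity in progress.
            if current_tokens:
                entities.append((" ".join(current_tokens), current_type))
            current_tokens, current_type = [token], tag[2:]
        elif tag.startswith("I-") and current_type == tag[2:]:
            # Continuation of the current entity.
            current_tokens.append(token)
        else:
            # "O" tag or inconsistent "I-" tag ends the current entity.
            if current_tokens:
                entities.append((" ".join(current_tokens), current_type))
            current_tokens, current_type = [], None
    if current_tokens:
        entities.append((" ".join(current_tokens), current_type))
    return entities

# Toy clinical-note fragment with hypothetical model output.
tokens = ["Patient", "denies", "chest", "pain", ";", "started", "on", "aspirin"]
tags   = ["O", "O", "B-problem", "I-problem", "O", "O", "O", "B-drug"]
print(decode_bio(tokens, tags))
# → [('chest pain', 'problem'), ('aspirin', 'drug')]
```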

Citation

```bibtex
@article{hu2024information,
  title={Information Extraction from Clinical Notes: Are We Ready to Switch to Large Language Models?},
  author={Hu, Yan and Zuo, Xu and Zhou, Yujia and Peng, Xueqing and Huang, Jimin and Keloth, Vipina K and Zhang, Vincent J and Weng, Ruey-Ling and Chen, Qingyu and Jiang, Xiaoqian and others},
  journal={arXiv preprint arXiv:2411.10020},
  year={2024}
}
```