---
license: apache-2.0
language:
- zh
library_name: transformers
pipeline_tag: text-generation
tags:
- Doctor_consultation
- Taiwan
- fine-tuning
---
# Taiwan-inquiry_7B_v1.0
"The model was fine-tuned based on the Taiwan-LLM-7B-v2.1-chat model using a dataset that includes 614 authentic dialogues from the National Cheng Kung University Hospital. Additionally, 101 synthetic dialogues were included in the training set. The dialogue content covers topics referenced from OSCE (臨床技能測驗) sample questions in Taiwan. These synthetic dialogues were generated using GPT-3.5 and Geminio-Pro. The training process aimed to enhance the model's performance in understanding and generating responses related to clinical skills assessment scenarios in a medical context."
## Model Description
- Developed by: Joseph (Chen-Wei) Li, research assistant at National Taiwan University Hospital.
- Model type: A 7B-parameter GPT-like model fine-tuned on a mix of private and synthetic dialogue datasets.
- Language(s) (NLP): Traditional Chinese (zh-tw)
- Finetuned from model: yentinglin/Taiwan-LLM-7B-v2.1-chat
## Usage of the model
You take on the role of a doctor, and the model converses with you as the patient. Provide a brief patient background in the system prompt, and the model will respond in character based on that prompt (see the sketch after the example below).
### Example
- prompt: 一名50歲先生抱怨視線模糊和眼睛乾澀,請進行眼科檢查。(A 50-year-old man complains of blurred vision and dry eyes; please perform an ophthalmologic examination.)
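Below is a minimal sketch of how the model could be loaded and prompted with the Transformers library. The repository id, the doctor's opening question, and the generation settings are illustrative placeholders, and it is assumed that the tokenizer provides a chat template (as in the Taiwan-LLM base model); adjust the prompt formatting if it does not.

```python
# Minimal sketch: role-play as a doctor, with the patient background in the system prompt.
# The repo id below is a placeholder; replace it with the actual Hugging Face repository.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "<your-namespace>/Taiwan-inquiry_7B_v1.0"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # a 7B model in fp16 fits on a single ~16 GB GPU
    device_map="auto",
)

messages = [
    # System prompt: the brief patient background from the example above.
    {"role": "system", "content": "一名50歲先生抱怨視線模糊和眼睛乾澀,請進行眼科檢查。"},
    # The user speaks as the doctor; this opening question is illustrative.
    {"role": "user", "content": "您好,請問您今天哪裡不舒服?"},
]

# Assumes the tokenizer ships a chat template; otherwise format the prompt manually.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(
    inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
)
# Print only the newly generated patient reply, not the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The model should answer in character as the patient (in Traditional Chinese), and you can continue the consultation by appending each doctor/patient turn to `messages` and generating again.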