abhishek-ch committed
Commit aa3749b
1 Parent(s): c17b3ea

Updated the README

Files changed (1): README.md (+37, −2)
README.md CHANGED

# abhishek-ch/biomistral-7b-synthetic-ehr

This model was converted to MLX format from [`BioMistral/BioMistral-7B-DARE`](https://huggingface.co/BioMistral/BioMistral-7B-DARE).
Refer to the [original model card](https://huggingface.co/BioMistral/BioMistral-7B-DARE) for more details on the model.

## Use with mlx

```bash
pip install mlx-lm
```

The model was fine-tuned on [health_facts](https://huggingface.co/datasets/health_fact) and a synthetic EHR dataset inspired by MIMIC-IV for 1000 steps using mlx.
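
mlx-lm ships a LoRA fine-tuning entry point that can be invoked roughly as below. The data path is a placeholder and exact flags vary across mlx-lm versions, so treat this as a sketch rather than the exact command used for this model:

```bash
python -m mlx_lm.lora \
  --model BioMistral/BioMistral-7B-DARE \
  --train \
  --data ./data \
  --iters 1000
```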

The prompt template pairs a system instruction with a user question:

```python
def format_prompt(prompt: str, question: str) -> str:
    """Wrap a system instruction and a user question in Mistral's [INST] chat format."""
    return """<s>[INST]
## Instructions
{}
## User Question
{}
[/INST]</s>
""".format(prompt, question)
```
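
For example, `format_prompt("Answer briefly.", "What is an EHR?")` produces the `[INST]`-wrapped string the fine-tuned model expects:

```
<s>[INST]
## Instructions
Answer briefly.
## User Question
What is an EHR?
[/INST]</s>
```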

Example for EHR diagnosis:

```python
prompt = """You are an expert in providing diagnosis summaries based on clinical notes.
Objective: Your task is to generate concise summaries of the diagnosis, focusing on critical information."""
```

Example for HealthFacts fact-checking:

```python
prompt = """You are a Public Health AI Assistant. You can do the fact-checking of public health claims.
Each answer is labelled with true, false, unproven or mixture.
Please provide the reason behind the answer."""
```
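
As a quick end-to-end illustration, pair either prompt with a user question; the claim below is made up purely to show the expected input shape:

```python
# Hypothetical claim, used only to illustrate the input format
question = "Vitamin C megadoses cure the common cold."
full_prompt = format_prompt(prompt, question)
```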

## Model Loading Using mlx

```python
from mlx_lm import generate, load

model, tokenizer = load("abhishek-ch/biomistral-7b-synthetic-ehr")
response = generate(
    model,
    tokenizer,
    prompt=format_prompt(prompt, question),  # prompt and question built above
    verbose=True,  # Set to True to see the prompt and response
    temp=0.0,  # greedy decoding for deterministic answers
    max_tokens=512,
)
```
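
`generate` returns the decoded completion as a string, so `response` can be logged or post-processed directly; with `temp=0.0` the model decodes greedily, which keeps diagnosis summaries and fact-check verdicts reproducible across runs.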