---
language:
  - en
  - fr
  - nl
  - es
  - it
  - pl
  - ro
  - de
license: apache-2.0
library_name: transformers
tags:
  - mergekit
  - merge
  - dare
  - medical
  - biology
  - mlx
datasets:
  - pubmed
base_model:
  - BioMistral/BioMistral-7B
  - mistralai/Mistral-7B-Instruct-v0.1
pipeline_tag: text-generation
---

# abhishek-ch/biomistral-7b-synthetic-ehr

This model was converted to the MLX format from BioMistral/BioMistral-7B-DARE. Refer to the original model card for more details on the model.

## Use with mlx

```sh
pip install mlx-lm
```

The model was fine-tuned on health_facts and a synthetic EHR dataset inspired by MIMIC-IV for 1,000 steps using mlx.

```python
def format_prompt(prompt: str, question: str) -> str:
    return """<s>[INST]
## Instructions
{}
## User Question
{}.
[/INST]</s>
""".format(prompt, question)
```

## Example for EHR Diagnosis

Prompt = """You are an expert in provide diagnosis summary based on clinical notes.
Objective: Your task is to generate concise summaries of the diagnosis, focusing on critical information"""

## Example for Healthfacts Check

```python
prompt = """You are a Public Health AI Assistant. You can fact-check public health claims.
Each answer is labelled true, false, unproven, or mixture.
Please provide the reason behind the answer."""
```

## Model Loading Using mlx

```python
from mlx_lm import generate, load

model, tokenizer = load("abhishek-ch/biomistral-7b-synthetic-ehr")
response = generate(
    model,
    tokenizer,
    prompt=format_prompt(prompt, question),
    verbose=True,  # print the prompt and the generated response
    temp=0.0,
    max_tokens=512,
)
```