Base Model : mosaicml/mpt-30b
Tool : MosaicML's llm-foundry (https://github.com/mosaicml/llm-foundry)
Dataset : The entire flan3m-GPT3.5 dataset
Config yaml with Model Params : https://huggingface.co/iamplus/mpt-30b-v2/blob/main/mpt-30b_orca.yaml
Prompt Format :
```
<system>: [system prompt]
<human>: [question]
<bot>:
```
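
This repo ships custom MPT model code, so loading it with Hugging Face transformers requires `trust_remote_code=True`. Below is a minimal loading sketch, assuming the repo id `iamplus/mpt-30b-v2` taken from the config URL above; the dtype and device settings are illustrative, not prescribed by this card.

```python
# Minimal sketch: load this fine-tuned MPT-30B checkpoint with transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "iamplus/mpt-30b-v2"  # assumed from the config URL above

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 30B parameters: use reduced precision and sufficient GPU memory
    trust_remote_code=True,      # pulls the custom MPT modeling code bundled with the repo
    device_map="auto",           # requires the `accelerate` package
)
```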
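
Continuing from the loading sketch, the following is a hedged example of filling in the prompt format above (turns joined with newlines, as the format suggests) and generating a reply. The system prompt, question, and sampling settings are placeholders.

```python
# Fill the prompt template with an example system prompt and question.
prompt = (
    "<system>: You are a helpful assistant.\n"
    "<human>: What is the capital of France?\n"
    "<bot>:"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(
    **inputs,
    max_new_tokens=256,  # illustrative generation settings
    do_sample=True,
    temperature=0.7,
)
# Strip the prompt tokens and print only the model's reply.
print(tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```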