MT7Bi-dpo
Base model: Technoculture/MT7Bi-sft, with the Technoculture/MT7Bi-alpha-dpo-v0.2 adapter applied.
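A minimal loading sketch, assuming the adapter repository is a standard PEFT (LoRA-style) adapter on top of the sft checkpoint; this has not been verified against the actual repo contents:

```python
# Sketch: load the base model, then apply the DPO adapter with PEFT.
# Repo names are taken from the card; adapter format is an assumption.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("Technoculture/MT7Bi-sft")
tokenizer = AutoTokenizer.from_pretrained("Technoculture/MT7Bi-sft")

model = PeftModel.from_pretrained(base, "Technoculture/MT7Bi-alpha-dpo-v0.2")
model = model.merge_and_unload()  # optionally merge adapter weights for plain inference
```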
Open LLM Leaderboard
| Model Name | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K |
| --- | --- | --- | --- | --- | --- | --- |
| Orca-2-7b | 78.4 | 76.1 | 53.7 | 52.4 | 74.2 | 47.2 |
| LLAMA-2-7b | 43.2 | 77.1 | 44.4 | 38.7 | 69.5 | 16.0 |
| MT7Bi-sft | 54.1 | 75.11 | - | 43.08 | 72.14 | 15.54 |
| MT7Bi-dpo | 54.69 | 75.89 | 52.82 | 45.48 | 71.58 | 25.93 |
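For a quick summary of the table, the five tasks reported for both MT7Bi checkpoints (MMLU is missing for the sft model) can be averaged; the numbers below are computed directly from the scores above:

```python
# Five-task average (ARC, HellaSwag, TruthfulQA, Winogrande, GSM8K),
# excluding MMLU since no sft score is reported for it.
sft = [54.1, 75.11, 43.08, 72.14, 15.54]
dpo = [54.69, 75.89, 45.48, 71.58, 25.93]

sft_avg = round(sum(sft) / len(sft), 2)
dpo_avg = round(sum(dpo) / len(dpo), 2)
print(sft_avg, dpo_avg)  # 51.99 54.71
```

On these five tasks, DPO lifts the average from roughly 51.99 to 54.71, with GSM8K accounting for most of the gain.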