---
library_name: transformers
base_model:
- mistralai/Mistral-Large-Instruct-2411
datasets:
- jondurbin/gutenberg-dpo-v0.1
- nbeerbower/gutenberg2-dpo
- nbeerbower/gutenberg-moderne-dpo
license: other
license_name: mrl
---
# Gigaberg-Mistral-Large-123B

[mistralai/Mistral-Large-Instruct-2411](https://huggingface.co/mistralai/Mistral-Large-Instruct-2411) finetuned on [jondurbin/gutenberg-dpo-v0.1](https://huggingface.co/datasets/jondurbin/gutenberg-dpo-v0.1), [nbeerbower/gutenberg2-dpo](https://huggingface.co/datasets/nbeerbower/gutenberg2-dpo), and [nbeerbower/gutenberg-moderne-dpo](https://huggingface.co/datasets/nbeerbower/gutenberg-moderne-dpo).
# Method

ORPO-tuned on 1x NVIDIA H200 for 1 epoch.
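The exact training script is not included here. As a reference for what ORPO optimizes, the objective can be sketched in plain Python: the standard NLL loss on the chosen response plus an odds-ratio penalty that pushes the chosen completion's (length-normalized) probability above the rejected one's. The weight `lam` below is illustrative, not the value used for this model.

```python
import math

def odds(p):
    # Odds of a length-normalized sequence probability p in (0, 1).
    return p / (1.0 - p)

def orpo_loss(nll_chosen, p_chosen, p_rejected, lam=0.1):
    # ORPO objective (Hong et al., 2024), sketched on scalar probabilities:
    # NLL on the chosen response plus a penalty of -log sigmoid of the
    # log odds ratio between chosen and rejected completions.
    log_odds_ratio = math.log(odds(p_chosen)) - math.log(odds(p_rejected))
    l_or = -math.log(1.0 / (1.0 + math.exp(-log_odds_ratio)))  # -log sigmoid
    return nll_chosen + lam * l_or
```

When chosen and rejected are equally likely the penalty reduces to `log 2`; as the model separates them, the penalty shrinks toward zero. In practice this is what libraries such as TRL's `ORPOTrainer` compute over batched token log-probabilities.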