
Model Card for Swisslex/Mixtral-8x7b-DPO-v0.2

Model Details

Model Description

A fine-tuned version of mistralai/Mixtral-8x7B-v0.2, trained with supervised fine-tuning (SFT) followed by direct preference optimization (DPO).

  • Developed by: Swisslex
  • Language(s) (NLP): English, German, French, Italian, Spanish
  • License: apache-2.0
  • Finetuned from model: mistralai/Mixtral-8x7B-v0.2
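
The model card itself does not include usage code; the following is a hypothetical loading sketch, assuming this repository ships a PEFT adapter on top of the Mixtral base model listed above (the `transformers`, `peft`, and `torch` packages are required; the prompt string is illustrative only):

```python
def load_and_generate(prompt: str) -> str:
    """Load the base model, attach the DPO adapter, and generate a reply.

    Hypothetical sketch: assumes a standard PEFT adapter layout in the
    Swisslex/Mixtral-8x7b-DPO-v0.2 repository.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import PeftModel

    base_id = "mistralai/Mixtral-8x7B-v0.2"        # base model named in the card
    adapter_id = "Swisslex/Mixtral-8x7b-DPO-v0.2"  # this repository

    tokenizer = AutoTokenizer.from_pretrained(base_id)
    base = AutoModelForCausalLM.from_pretrained(
        base_id, torch_dtype=torch.bfloat16, device_map="auto"
    )
    model = PeftModel.from_pretrained(base, adapter_id)

    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=256)
    return tokenizer.decode(output[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(load_and_generate("Summarize the key points of Swiss contract law."))
```

Note that the full 46.7B-parameter base model must fit in memory; on smaller GPUs, quantized loading (e.g. 4-bit via `bitsandbytes`) is a common alternative.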
  • Model size: 46.7B params
  • Tensor type: BF16
