---

library_name: transformers
datasets:
- wzhouad/zephyr-ultrafeedback-hybrid
---


## Description

This is the mistral-7b-sft-beta model fine-tuned with hybrid WPO (GPT-4-turbo outputs combined with on-policy sampling on UltraFeedback). For details, see [WPO: Enhancing RLHF with Weighted Preference Optimization](https://arxiv.org/abs/2406.11827). The training data is [wzhouad/zephyr-ultrafeedback-hybrid](https://huggingface.co/datasets/wzhouad/zephyr-ultrafeedback-hybrid).
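A minimal usage sketch with the `transformers` chat-template API is below. The repository id is a placeholder (this card does not state the model's Hub id), and the single-user-turn chat format is an assumption based on the Zephyr lineage; substitute the actual repo id before running.

```python
# Hedged usage sketch for a chat-tuned Mistral-7B model.
# MODEL_ID is a hypothetical placeholder, not confirmed by this card.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "your-org/your-wpo-model"  # placeholder: replace with this model's Hub id


def build_messages(prompt: str) -> list[dict]:
    """Assumed Zephyr-style chat format: a single user turn."""
    return [{"role": "user", "content": prompt}]


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    # Render the chat turns with the tokenizer's built-in chat template,
    # appending the assistant generation prompt.
    inputs = tokenizer.apply_chat_template(
        build_messages(prompt), add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    out = model.generate(inputs, max_new_tokens=max_new_tokens, do_sample=False)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Explain preference optimization in one sentence."))
```

Greedy decoding (`do_sample=False`) is used here for reproducibility; adjust the generation parameters to taste.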

## License
This model is licensed under the Zoom software license and is permitted for noncommercial, educational, and academic research use only.