---
license: apache-2.0
datasets:
- Intel/orca_dpo_pairs
tags:
- code
---
# NeuralMaxime 7B DPO
![](https://raw.githubusercontent.com/kukedlc87/imagenes/main/DALL%C2%B7E%202024-02-19%2001.43.13%20-%20A%20futuristic%20and%20technological%20image%20featuring%20a%20robot%20whose%20face%20is%20a%20screen%20displaying%20the%20text%20'DPO'.%20The%20scene%20symbolizes%20the%20technique%20for%20fine-t.webp)
## Recipe
- **Merge:** NeuralMonarch and AlphaMonarch (by MLabonne), combined with MergeKit
- **Fine-tuning:** Direct Preference Optimization (DPO) on the Intel/orca_dpo_pairs dataset
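
The exact merge recipe is not published in this card. For readers unfamiliar with MergeKit, a config merging the two Monarch models could look like the sketch below; the merge method (SLERP), interpolation parameters, and layer ranges are illustrative assumptions, not the actual recipe used for this model:

```yaml
# Illustrative MergeKit config (assumed SLERP merge; not the published recipe)
slices:
  - sources:
      - model: mlabonne/NeuralMonarch-7B
        layer_range: [0, 32]
      - model: mlabonne/AlphaMonarch-7B
        layer_range: [0, 32]
merge_method: slerp
base_model: mlabonne/NeuralMonarch-7B
parameters:
  t:
    # Interpolation weights per layer group (assumed values)
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```

With MergeKit installed, a config like this would be run as `mergekit-yaml config.yml ./merged-model`, after which the merged checkpoint can be DPO fine-tuned on a preference dataset such as Intel/orca_dpo_pairs.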