---
license: llama2
language:
- en
pipeline_tag: conversational
tags:
- Xwin
- Euryale 1.3
- frankenmerge
- 90b
---
# BigWeave v6 90B
<img src="https://cdn-uploads.huggingface.co/production/uploads/65a6db055c58475cf9e6def1/4CbbAN-X7ZWj702JrcCGH.png" width=600>
A Goliath-120b-style frankenmerge of Xwin-LM-70b-v0.1 and Euryale-1.3-70b. The goal is to find merge combinations beyond Goliath that work well.
The version number is only for keeping track of the merges; only results that seem to work reasonably well are kept/published.
# Prompting Format
Vicuna and Alpaca.
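For reference, the standard Vicuna and Alpaca templates look roughly like the following (a sketch of the common conventions; check your inference frontend's defaults, as the exact system prompt is not specified here):

```
Vicuna:
USER: {prompt}
ASSISTANT:

Alpaca:
### Instruction:
{prompt}

### Response:
```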
# Merge Process
The models used in the merge are [Xwin-LM-70b-v0.1](https://huggingface.co/Xwin-LM/Xwin-LM-70B-V0.1) and [Euryale-1.3-70b](https://huggingface.co/Sao10K/Euryale-1.3-L2-70B).
The layer mix:
```yaml
- range 0, 12
Xwin
- range 9, 14
Euryale
- range 12, 62
Xwin
- range 54, 71
Euryale
- range 62, 80
Xwin
```
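In mergekit terms, the layer mix above corresponds to a passthrough merge roughly like the config below. This is a sketch, not the published config: the slice ranges and models are taken from the list above, while `merge_method` and `dtype` are assumptions typical of Goliath-style frankenmerges.

```yaml
# Hypothetical mergekit config reconstructing the layer mix above
slices:
  - sources:
      - model: Xwin-LM/Xwin-LM-70B-V0.1
        layer_range: [0, 12]
  - sources:
      - model: Sao10K/Euryale-1.3-L2-70B
        layer_range: [9, 14]
  - sources:
      - model: Xwin-LM/Xwin-LM-70B-V0.1
        layer_range: [12, 62]
  - sources:
      - model: Sao10K/Euryale-1.3-L2-70B
        layer_range: [54, 71]
  - sources:
      - model: Xwin-LM/Xwin-LM-70B-V0.1
        layer_range: [62, 80]
merge_method: passthrough  # assumption: layers are stacked, not averaged
dtype: float16             # assumption
```

Note that adjacent slices overlap (e.g. Xwin layers 9-12 appear both before and inside the first Euryale slice), which is what inflates the merge from 80 layers to a ~90B-parameter model.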
# Acknowledgements
- [@Xwin-LM](https://huggingface.co/Xwin-LM) for creating Xwin
- [@Sao10K](https://huggingface.co/Sao10K) for creating Euryale
- [@alpindale](https://huggingface.co/alpindale) for creating the original Goliath
- [@chargoddard](https://huggingface.co/chargoddard) for developing [mergekit](https://github.com/cg123/mergekit)