---
license: mit
language:
- en
pipeline_tag: text-generation
inference: false
tags:
- dare
- super mario merge
- pytorch
- mixtral
- merge
---
# mixtral dare test
The following models were merged with DARE using [https://github.com/martyn/safetensors-merge-supermario](https://github.com/martyn/safetensors-merge-supermario).
## Mergelist
```
mistralai/Mixtral-8x7B-Instruct-v0.1
Open-Orca/Mixtral-SlimOrca-8x7B
```
## Merge command
```
python3 hf_merge.py to_merge_mixtral0.txt mixtral-0 -p 0.3 -lambda 2.1
```
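
For context, DARE ("Drop And REscale") merges a fine-tuned model into a base model by zeroing a random fraction of the fine-tune's weight delta and rescaling the surviving entries by 1/(1-p). A minimal sketch of that rule, assuming `-p` is the drop probability and `-lambda` scales the merged delta; the flag semantics and function names here are assumptions, not hf_merge.py's actual code:

```
import torch

def dare_delta(base: torch.Tensor, finetuned: torch.Tensor, p: float) -> torch.Tensor:
    """DARE: zero a random fraction p of the delta, rescale survivors by 1/(1-p)."""
    delta = finetuned - base
    keep = torch.bernoulli(torch.full_like(delta, 1.0 - p))  # 1 = keep, 0 = drop
    return delta * keep / (1.0 - p)

def merge(base_sd: dict, other_sd: dict, p: float = 0.3, lam: float = 2.1) -> dict:
    """Hypothetical per-tensor merge: base + lambda * DARE(delta)."""
    return {
        name: w + lam * dare_delta(w, other_sd[name], p)
        for name, w in base_sd.items()
    }
```

Under this reading, `-p 0.3` zeroes roughly 30% of each delta's entries and scales the rest by 1/0.7 ≈ 1.43, before the `-lambda 2.1` scaling is applied.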
## Notes
* This is primarily a test to see whether merging Mixtral models works.
* MoE gates are not merged (see the sketch below).
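
Leaving the routers intact means the merge loop skips gate tensors by name. A sketch reusing `dare_delta` from above, assuming the Hugging Face Mixtral naming where router weights appear as `model.layers.N.block_sparse_moe.gate.weight` (an assumption, not the repo's verified behavior):

```
def should_merge(name: str) -> bool:
    # Keep MoE router ("gate") weights from the base model untouched.
    return ".block_sparse_moe.gate." not in name

def merge_skipping_gates(base_sd: dict, other_sd: dict, p: float = 0.3, lam: float = 2.1) -> dict:
    return {
        name: w + lam * dare_delta(w, other_sd[name], p) if should_merge(name) else w
        for name, w in base_sd.items()
    }
```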