Add README
README.md ADDED
---
license: mit
language:
- en
pipeline_tag: text-generation
inference: false
tags:
- dare
- super mario merge
- pytorch
- mixtral
---

# mixtral dare test

The following models were merged with DARE using [https://github.com/martyn/safetensors-merge-supermario](https://github.com/martyn/safetensors-merge-supermario).
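
For reference, DARE treats each model as a delta against a base, randomly drops a fraction `p` of the delta's entries, and rescales the survivors by `1/(1-p)` so the delta's expected value is preserved. A minimal sketch of that step on a single tensor pair, assuming PyTorch tensors (the function name, signature, and defaults are illustrative, not the script's actual API):

```
import torch

def dare_merge(base: torch.Tensor, other: torch.Tensor,
               p: float = 0.3, lam: float = 2.1) -> torch.Tensor:
    """Drop-and-rescale the delta between two weights, then add it back."""
    delta = other - base
    keep = torch.rand_like(delta) >= p   # drop each entry with probability p
    delta = delta * keep / (1.0 - p)     # rescale survivors so E[delta] is unchanged
    return base + lam * delta            # lambda scales the merged delta
```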

## Mergelist

```
mistralai/Mixtral-8x7B-Instruct-v0.1
Open-Orca/Mixtral-SlimOrca-8x7B
```

## Merge command

```
python3 hf_merge.py to_merge_mixtral0.txt mixtral-0 -p 0.3 -lambda 2.1
```
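
Assuming the flags map onto the DARE formulation sketched above, `-p 0.3` would set the per-entry drop probability and `-lambda 2.1` the scale applied to the merged delta; the script's own documentation is authoritative for the exact semantics.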

## Notes

* This is primarily a test to see whether merging Mixtral models works.
* I skip the merge on the MoE gates (see the sketch below).
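
Skipping the gates presumably means keeping each layer's MoE router weights from the base model rather than merging them. A hedged sketch of such a name filter, assuming Mixtral's Hugging Face parameter naming (the substring check is illustrative, not the script's actual logic):

```
def should_merge(name: str) -> bool:
    # Mixtral's router weights live at model.layers.N.block_sparse_moe.gate.weight
    return "block_sparse_moe.gate" not in name
```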