llmixer committed on
Commit 8697fc5
1 Parent(s): 5c6e762

Update README.md

Files changed (1)
  1. README.md +32 -28
README.md CHANGED
@@ -1,13 +1,6 @@
 ---
 license: llama2
-language:
-- en
-pipeline_tag: conversational
-tags:
-- lzlv
-- WinterGoddess
-- frankenmerge
-- 120b
+pipeline_tag: text-generation
 ---
 # BigLiz 120B
 
@@ -21,26 +14,37 @@ Vicuna and Alpaca.
 # Merge process
 The models used in the merge are [lzlv-70b](https://huggingface.co/lizpreciatior/lzlv_70b_fp16_hf) and [WinterGoddess-1.4x-70b](https://huggingface.co/Sao10K/WinterGoddess-1.4x-70B-L2).
 
-The layer mix:
 ```yaml
-- range 0, 16
-  lzlv
-- range 8, 24
-  WinterGoddess
-- range 17, 32
-  lzlv
-- range 25, 40
-  WinterGoddess
-- range 33, 48
-  lzlv
-- range 41, 56
-  WinterGoddess
-- range 49, 64
-  lzlv
-- range 57, 72
-  WinterGoddess
-- range 65, 80
-  lzlv
+slices:
+- sources:
+  - model: lizpreciatior_lzlv_70b_fp16_hf
+    layer_range: [0, 16]
+- sources:
+  - model: Sao10K_WinterGoddess-1.4x-70B-L2
+    layer_range: [8, 24]
+- sources:
+  - model: lizpreciatior_lzlv_70b_fp16_hf
+    layer_range: [17, 32]
+- sources:
+  - model: Sao10K_WinterGoddess-1.4x-70B-L2
+    layer_range: [25, 40]
+- sources:
+  - model: lizpreciatior_lzlv_70b_fp16_hf
+    layer_range: [33, 48]
+- sources:
+  - model: Sao10K_WinterGoddess-1.4x-70B-L2
+    layer_range: [41, 56]
+- sources:
+  - model: lizpreciatior_lzlv_70b_fp16_hf
+    layer_range: [49, 64]
+- sources:
+  - model: Sao10K_WinterGoddess-1.4x-70B-L2
+    layer_range: [57, 72]
+- sources:
+  - model: lizpreciatior_lzlv_70b_fp16_hf
+    layer_range: [65, 80]
+merge_method: passthrough
+dtype: float16
 ```
 
 # Acknowledgements
@@ -50,4 +54,4 @@ The layer mix:
 
 [@alpindale](https://huggingface.co/alpindale) For creating the original Goliath
 
-[@chargoddard](https://huggingface.co/chargoddard) For developing [mergekit](https://github.com/cg123/mergekit).
+[@chargoddard](https://huggingface.co/chargoddard) For developing [mergekit](https://github.com/cg123/mergekit).
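The passthrough merge in the config above simply stacks overlapping layer ranges from the two 70B source models into one deeper network. A minimal Python sketch of how the slice plan expands (the `expand_slices` helper is illustrative, not part of mergekit, and it assumes mergekit's end-exclusive `layer_range` convention):

```python
# Illustrative sketch (not mergekit's API): expand the slice plan into an
# ordered list of (source model, source layer) pairs for the merged stack.
def expand_slices(slices):
    """slices: (model, start, end) tuples with end-exclusive layer ranges."""
    return [(model, layer)
            for model, start, end in slices
            for layer in range(start, end)]

# Slice plan from the merge config above.
plan = [
    ("lzlv", 0, 16), ("WinterGoddess", 8, 24),
    ("lzlv", 17, 32), ("WinterGoddess", 25, 40),
    ("lzlv", 33, 48), ("WinterGoddess", 41, 56),
    ("lzlv", 49, 64), ("WinterGoddess", 57, 72),
    ("lzlv", 65, 80),
]

layers = expand_slices(plan)
print(len(layers))  # 137 layers in the merged stack

# A Llama-2 70B model has 80 layers; scaling parameter count by depth
# gives a rough size estimate, which is roughly where "120B" comes from.
print(round(70 * len(layers) / 80, 1))  # ~119.9 (billion parameters)
```

Note how consecutive slices overlap (e.g. lzlv layers 8–15 appear again as WinterGoddess layers 8–23 begin), the same interleaving pattern used by Goliath-style frankenmerges.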