Ppoyaa committed on
Commit
f8797ee
1 Parent(s): 393f864

Update README.md

Files changed (1): README.md (+0 −52)
README.md CHANGED
@@ -1,54 +1,2 @@
  ---
- base_model:
- - Ppoyaa/LuminRP-7B-128k-v0.5
- library_name: transformers
- tags:
- - mergekit
- - merge

- ---
- # merge
-
- This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
-
- ## Merge Details
- ### Merge Method
-
- This model was merged using the passthrough merge method.
-
- ### Models Merged
-
- The following models were included in the merge:
- * [Ppoyaa/LuminRP-7B-128k-v0.5](https://huggingface.co/Ppoyaa/LuminRP-7B-128k-v0.5)
-
- ### Configuration
-
- The following YAML configuration was used to produce this model:
-
- ```yaml
- slices:
- - sources:
-   - layer_range: [0, 8]
-     model: Ppoyaa/LuminRP-7B-128k-v0.5
- - sources:
-   - layer_range: [4, 12]
-     model: Ppoyaa/LuminRP-7B-128k-v0.5
- - sources:
-   - layer_range: [8, 16]
-     model: Ppoyaa/LuminRP-7B-128k-v0.5
- - sources:
-   - layer_range: [12, 20]
-     model: Ppoyaa/LuminRP-7B-128k-v0.5
- - sources:
-   - layer_range: [16, 24]
-     model: Ppoyaa/LuminRP-7B-128k-v0.5
- - sources:
-   - layer_range: [20, 28]
-     model: Ppoyaa/LuminRP-7B-128k-v0.5
- - sources:
-   - layer_range: [24, 32]
-     model: Ppoyaa/LuminRP-7B-128k-v0.5
- merge_method: passthrough
- dtype: bfloat16
-
- ```
 
  ---
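For reference, the removed YAML stacks seven overlapping 8-layer slices of the same 32-layer base model. A minimal sketch of the resulting depth arithmetic (the slice list is copied from the config above; nothing here invokes mergekit itself):

```python
# Layer ranges from the passthrough config (start inclusive, end exclusive).
slices = [(0, 8), (4, 12), (8, 16), (12, 20), (16, 24), (20, 28), (24, 32)]

# Passthrough merging concatenates the sliced layer stacks without blending
# weights, so the merged model's depth is the sum of the slice lengths.
total_layers = sum(end - start for start, end in slices)
print(total_layers)  # 7 slices x 8 layers = 56 layers in the merged model
```

The stride-4 overlap means layers 4–27 of the base model each appear twice in the merged stack, which is how this kind of frankenmerge grows a 7B model's depth.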