Lewdiculous committed
Commit
8604c12
1 Parent(s): af8f6c1

Create README.md

Files changed (1)
  1. README.md +44 -0
README.md ADDED
---
base_model:
- Epiculous/Fett-uccine-Long-Noodle-7B-120k-Context
- Nitral-AI/Infinitely-Laydiculous-7B
library_name: transformers
tags:
- mergekit
- merge
- roleplay
---

![image/png](https://cdn-uploads.huggingface.co/production/uploads/642265bc01c62c1e4102dc36/ThhZa1NaOwj6V2iHL_rsn.png)

This model was merged using mergekit's SLERP (spherical linear interpolation) merge method.
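
For reference, SLERP interpolates each pair of weight tensors along the great-circle arc between them rather than along a straight line, which preserves the angular relationship between the two weight vectors better than plain linear averaging. The snippet below is a minimal NumPy sketch of the underlying operation for a single pair of flattened tensors; it illustrates the math only and is not mergekit's actual implementation.

```python
import numpy as np

def slerp(t: float, a: np.ndarray, b: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two flattened weight tensors.

    t = 0 returns `a`, t = 1 returns `b`; intermediate values move along
    the great-circle arc between the two directions.
    """
    a_unit = a / (np.linalg.norm(a) + eps)
    b_unit = b / (np.linalg.norm(b) + eps)
    # Angle between the two weight vectors.
    omega = np.arccos(np.clip(np.dot(a_unit, b_unit), -1.0, 1.0))
    if omega < eps:
        # Nearly parallel vectors: slerp degenerates to linear interpolation.
        return (1.0 - t) * a + t * b
    sin_omega = np.sin(omega)
    return (np.sin((1.0 - t) * omega) / sin_omega) * a + (np.sin(t * omega) / sin_omega) * b
```

The per-tensor interpolation factor `t` comes from the schedule in the configuration below.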

### Models Merged

The following models were included in the merge:
* [Epiculous/Fett-uccine-Long-Noodle-7B-120k-Context](https://huggingface.co/Epiculous/Fett-uccine-Long-Noodle-7B-120k-Context)
* [Nitral-AI/Infinitely-Laydiculous-7B](https://huggingface.co/Nitral-AI/Infinitely-Laydiculous-7B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
  - sources:
      - model: Epiculous/Fett-uccine-Long-Noodle-7B-120k-Context
        layer_range: [0, 32]
      - model: Nitral-AI/Infinitely-Laydiculous-7B
        layer_range: [0, 32]
merge_method: slerp
base_model: Epiculous/Fett-uccine-Long-Noodle-7B-120k-Context
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
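
In this schedule, `t = 0` keeps the base model's weights and `t = 1` takes the other model's; the self-attention and MLP weights follow opposite gradients across the 32 layers, and all remaining tensors use a flat 0.5. A merge like this can be reproduced with mergekit's `mergekit-yaml` CLI (e.g. `mergekit-yaml config.yaml ./output-dir`).

Since the card lists `library_name: transformers`, the merged model loads with the standard transformers API. Below is a minimal inference sketch; the repo id is a placeholder (assumption), so substitute the repository this card actually describes:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-username/your-merged-model"  # placeholder, not the real repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the dtype the merge was saved in
    device_map="auto",
)

prompt = "Write a short scene between two rivals forced to share a camp for the night."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```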