An attempt to make ParasiticRogue's model a tad better on longer contexts. I just ran the merge script; all credit for the original merge goes to my friend.

![image/png](https://cdn-uploads.huggingface.co/production/uploads/6550b16f7490049d6237f200/1aFzrFX2W4cilMSZnoX2k.png)

Exl2 quants are already being uploaded by him:
https://huggingface.co/ParasiticRogue/RP-Stew-v2.5-34B-exl2-4.65

Also, my samplers, instruct template, and story string for the model (updated, new format):

Samplers: https://files.catbox.moe/8ficm1.json

Instruct: https://files.catbox.moe/nlflxw.json

Story String: https://files.catbox.moe/6bk6gj.json

If you're using Vectors (I use them for summaries of different story arcs):

![image/png](https://cdn-uploads.huggingface.co/production/uploads/6550b16f7490049d6237f200/-Adevxuhmywjrs2WtexZK.png)

Alternative samplers, plus an "old" version of the story string and instruct template (classic format):

Samplers: https://files.catbox.moe/ef67mj.json

Instruct: https://files.catbox.moe/uba6o1.json

Story String: https://files.catbox.moe/t5gfun.json

I'm unsure which set works better, so it's up to your own tests.

```
models:
  - model: F:\Merge\ParasiticRogue_Nontoxic-PiVoT-Bagel-RP-34b
    parameters:
      weight: 0.16
      density: 0.42
  - model: F:\Merge\ParasiticRogue_Nyakura-CausalLM-RP-34B
    parameters:
      weight: 0.22
      density: 0.54
  - model: F:\Merge\migtissera_Tess-34B-v1.5b
    parameters:
      weight: 0.28
      density: 0.66
  - model: F:\Merge\brucethemoose_Capybara-Fixed-Temp
    parameters:
      weight: 0.34
      density: 0.78
merge_method: dare_ties
base_model: F:\Merge\chargoddard_Yi-34B-200K-Llama
parameters:
  int8_mask: true
dtype: bfloat16
```
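The config uses mergekit's `dare_ties` method: for each model, a task vector (the fine-tuned weights minus the base model's weights) is sparsified by randomly dropping a fraction `1 - density` of its entries and rescaling the survivors by `1 / density`, and the sparsified vectors are then combined TIES-style with the listed `weight`s on top of the base model. A toy sketch of the DARE step on a flat list of deltas (illustrative only, not mergekit's actual implementation):

```python
import random

def dare(delta, density, seed=0):
    """DARE (Drop And REscale): zero out each task-vector entry with
    probability (1 - density), and rescale the kept entries by 1/density
    so the expected magnitude of the delta is preserved."""
    rng = random.Random(seed)
    return [d / density if rng.random() < density else 0.0 for d in delta]

# Toy "task vector": fine-tuned weights minus base weights.
delta = [0.1, -0.2, 0.3, 0.05, -0.15, 0.25]
sparse = dare(delta, density=0.5)
# Kept entries are doubled (1 / 0.5); the rest become 0.0.
```

In the config above, a higher `density` means fewer dropped entries, so the models with `density` 0.66 and 0.78 keep more of their deltas intact than the ones with 0.42 and 0.54 before the weighted TIES combination is applied.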