---
license: llama2
language:
- en
pipeline_tag: conversational
tags:
- Xwin
- Euryale 1.3
- Platypus2
- WinterGoddess
- frankenmerge
- dare
- ties
- 90b
---
# BigWeave v9 90B

<img src="https://cdn-uploads.huggingface.co/production/uploads/65a6db055c58475cf9e6def1/4CbbAN-X7ZWj702JrcCGH.png" width=600>

The BigWeave models aim to identify merge settings that equal or surpass the performance of Goliath-120b. The version number simply tracks successive attempts and is not a quality indicator; only merges that perform well are retained and shared.

This version is a DARE-TIES merge of two passthrough merges: Xwin-LM-70b-v0.1 + Euryale-1.3-70b ([BigWeave v6](https://huggingface.co/llmixer/BigWeave-v6-90b)) and Platypus2-70b-instruct + WinterGoddess-1.4x-70b (BigWeave v8). Both merges show strong performance individually, and the combined model achieves lower perplexity than either one on its own.

The 90b size allows 4-bit quants to fit into 48GB of VRAM.
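
As a rough back-of-envelope check (counting only the 4-bit weights themselves and ignoring quantization scales, activations, and KV cache, which add a few GB on top):

```python
# Back-of-envelope VRAM estimate for 4-bit weights only.
params = 90e9                 # ~90B parameters
bits_per_weight = 4           # 4-bit quantization
weight_bytes = params * bits_per_weight / 8
print(f"~{weight_bytes / 1e9:.0f} GB for weights")  # ~45 GB, leaving headroom on a 48 GB card
```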

# Prompting Format
Vicuna and Alpaca.
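
For reference, a minimal sketch of the two layouts (the helper names are illustrative, not part of the model):

```python
def vicuna_prompt(user_msg: str) -> str:
    # Vicuna-style: plain USER/ASSISTANT turns.
    return f"USER: {user_msg}\nASSISTANT:"

def alpaca_prompt(instruction: str) -> str:
    # Alpaca-style: instruction/response headers.
    return f"### Instruction:\n{instruction}\n\n### Response:\n"
```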

# Merge Process
The models used in the merge are [Xwin-LM-70b-v0.1](https://huggingface.co/Xwin-LM/Xwin-LM-70B-V0.1), [Euryale-1.3-70b](https://huggingface.co/Sao10K/Euryale-1.3-L2-70B), [Platypus2-70b-instruct](https://huggingface.co/garage-bAInd/Platypus2-70B-instruct) and [WinterGoddess-1.4x-70b](https://huggingface.co/Sao10K/WinterGoddess-1.4x-70B-L2).

Merge configuration:
```yaml
slices:
  - sources:
    - model: Xwin-LM/Xwin-LM-70B-V0.1
      layer_range: [0,12]
  - sources:
    - model: Sao10K/Euryale-1.3-L2-70B
      layer_range: [9,14]
  - sources:
    - model: Xwin-LM/Xwin-LM-70B-V0.1
      layer_range: [12,62]
  - sources:
    - model: Sao10K/Euryale-1.3-L2-70B
      layer_range: [54,71]
  - sources:
    - model: Xwin-LM/Xwin-LM-70B-V0.1
      layer_range: [62,80]
merge_method: passthrough
dtype: float16
---
slices:
  - sources:
    - model: garage-bAInd/Platypus2-70B-instruct
      layer_range: [0,12]
  - sources:
    - model: Sao10K/WinterGoddess-1.4x-70B-L2
      layer_range: [9,14]
  - sources:
    - model: garage-bAInd/Platypus2-70B-instruct
      layer_range: [12,62]
  - sources:
    - model: Sao10K/WinterGoddess-1.4x-70B-L2
      layer_range: [54,71]
  - sources:
    - model: garage-bAInd/Platypus2-70B-instruct
      layer_range: [62,80]
merge_method: passthrough
dtype: float16
---
models:
    - model: llmixer/BigWeave-v8-90b
      parameters:
        weight: 0.5
        density: 0.5
merge_method: dare_ties
base_model: llmixer/BigWeave-v6-90b
dtype: float16
```
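
The config above contains three YAML documents: the two passthrough merges (producing v6 and v8) and the final DARE-TIES merge. A minimal sketch of running them in sequence, assuming mergekit is installed (`pip install mergekit`) and each document has been saved to its own file (the file names here are placeholders):

```python
import subprocess

# Run the three merge stages in order with mergekit's command-line entry point.
# The final stage references llmixer/BigWeave-v6-90b and -v8-90b on the Hub;
# point it at the local output directories instead to reuse these merges.
stages = [
    ("passthrough_v6.yml", "./BigWeave-v6-90b"),  # Xwin + Euryale
    ("passthrough_v8.yml", "./BigWeave-v8-90b"),  # Platypus2 + WinterGoddess
    ("dare_ties_v9.yml",   "./BigWeave-v9-90b"),  # final DARE-TIES merge
]
for config, out_dir in stages:
    subprocess.run(["mergekit-yaml", config, out_dir], check=True)
```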

# Acknowledgements
[@Xwin-LM](https://huggingface.co/Xwin-LM) For creating Xwin

[@Sao10K](https://huggingface.co/Sao10K) For creating Euryale and WinterGoddess

[@garage-bAInd](https://huggingface.co/garage-bAInd) For creating Platypus2

[@alpindale](https://huggingface.co/alpindale) For creating the original Goliath

[@chargoddard](https://huggingface.co/chargoddard) For developing [mergekit](https://github.com/cg123/mergekit).