---
base_model:
- 152334H/miqu-1-70b-sf
license: unknown
language:
- en
pipeline_tag: text-generation
tags:
- merge
- frankenmerge
- 95b
---

# BigWeave v26 95b

<img src="https://cdn-uploads.huggingface.co/production/uploads/65a6db055c58475cf9e6def1/4CbbAN-X7ZWj702JrcCGH.png" width=600>

The BigWeave models aim to experimentally identify merge settings for increasing model performance. The version number merely tracks various attempts and is not a quality indicator. Only results demonstrating good performance are retained and shared.

# Prompting Format

ChatML, Mistral, and Vicuna formats all work (examples below).
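
For reference, these are generic templates for the three formats; the `{prompt}` and `{system_prompt}` placeholders are illustrative, so check the tokenizer configuration for the exact template:

```
# ChatML
<|im_start|>system
{system_prompt}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant

# Mistral
[INST] {prompt} [/INST]

# Vicuna
USER: {prompt}
ASSISTANT:
```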

# Merge process

This is a self-merge of 152334H/miqu-1-70b-sf. The last ~30 layers (49-79) are repeated in overlapping 10-layer slices with a stride of 5, bringing the total from 80 to 110 layers (54 + 5×10 + 6). According to exl2 measurements, these are among the most important layers.

Merge configuration:

```
slices:
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [0,54]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [49,59]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [54,64]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [59,69]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [64,74]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [69,79]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [74,80]
merge_method: passthrough
dtype: float16
```
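
To reproduce the merge, a config like the one above can be fed to mergekit. A minimal sketch, assuming mergekit is installed; the output directory name is a hypothetical choice and flags vary between mergekit versions:

```
pip install mergekit
mergekit-yaml config.yml ./bigweave-v26-95b --cuda
```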