This is a merge of pre-trained language models created using mergekit.
This model was merged using the SLERP merge method, with jaspionjader/bh-57 as the base model.
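SLERP (spherical linear interpolation) blends two weight tensors along the great-circle arc between them rather than mixing them linearly, which better preserves the geometry of the parameters when the two models' weights point in different directions. The following is a minimal per-tensor sketch of the idea, not mergekit's exact implementation:

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    t=0 returns `a`, t=1 returns `b`; intermediate values follow the
    great-circle arc between the two (flattened) tensors.
    """
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_unit = a_flat / (a_flat.norm() + eps)
    b_unit = b_flat / (b_flat.norm() + eps)
    # Angle between the two tensors, clamped for numerical safety.
    dot = torch.clamp(a_unit @ b_unit, -1.0, 1.0)
    theta = torch.arccos(dot)
    # Nearly colinear tensors: fall back to plain linear interpolation.
    if theta.abs() < eps:
        out = (1 - t) * a_flat + t * b_flat
    else:
        sin_theta = torch.sin(theta)
        out = (torch.sin((1 - t) * theta) / sin_theta) * a_flat \
            + (torch.sin(t * theta) / sin_theta) * b_flat
    return out.reshape(a.shape).to(a.dtype)
```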
The following models were included in the merge:

* jaspionjader/bh-57
* jaspionjader/f-5-8b
The following YAML configuration was used to produce this model:
```yaml
slices:
  - sources:
      - model: jaspionjader/bh-57
        layer_range:
          - 0
          - 32
      - model: jaspionjader/f-5-8b
        layer_range:
          - 0
          - 32
merge_method: slerp
base_model: jaspionjader/bh-57
parameters:
  t:
    - filter: self_attn
      value:
        - 0.09
        - 0.05
        - 0.07
        - 0.08
        - 0.06
    - filter: mlp
      value:
        - 0.06
        - 0.08
        - 0.07
        - 0.05
        - 0.09
    - value: 0.07
dtype: bfloat16
```
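Here, `t` is the interpolation factor: 0 keeps the `base_model` weights and 1 takes the other model's. The five-element lists define a gradient of `t` values across the layer range for self-attention and MLP tensors, and the final `value: 0.07` applies to all remaining tensors. A sketch of reproducing the merge from this config via mergekit's Python entry point follows; the config path and output directory are illustrative, and the `MergeConfiguration`/`run_merge` usage is assumed from the mergekit README:

```python
import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the YAML shown above into mergekit's config model.
with open("slerp-config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Execute the merge and write the merged weights to ./merged.
run_merge(
    merge_config,
    out_path="./merged",
    options=MergeOptions(cuda=torch.cuda.is_available()),
)
```

The same merge can be produced with mergekit's CLI, e.g. `mergekit-yaml slerp-config.yaml ./merged`.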