## Notes

This model is not intended for end use unless you specifically want a no-frills Qwen2.5 model merged with highly stable methods from proven ancestors. It is a baseline merge component.
## Merge Details

### Merge Method

This model was merged with mergekit using the TIES merge method, with merges/Qwentessential-14B-slerp as the base.
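For context, the sketch below shows one way a TIES merge like this can be run through mergekit's Python API (the `mergekit-yaml` CLI works equally well). The filename `qwentessential-v3.yaml` is hypothetical, and exact option and function names can differ between mergekit versions, so treat this as a sketch rather than the exact commands used to build this model.

```python
# Sketch: running the merge described in the Configuration section below with
# mergekit's Python API. Assumes `pip install mergekit` and that the YAML
# config is saved as qwentessential-v3.yaml (hypothetical filename).
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("qwentessential-v3.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./Qwentessential-14B-v3",   # output directory for the merged weights
    options=MergeOptions(
        cuda=torch.cuda.is_available(),   # merge on GPU when one is present
        copy_tokenizer=True,              # write the base tokenizer into the output
        lazy_unpickle=True,               # reduce peak RAM while reading shards
    ),
)
```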
### Models Merged

Only the base model, merges/Qwentessential-14B-slerp, contributes to this merge; every slice in the configuration below is drawn from it.
### Configuration

The following YAML configuration was used to produce this model:
```yaml
base_model: merges/Qwentessential-14B-slerp
dtype: float32
merge_method: ties
name: Qwentessential-14B-v3
out_dtype: bfloat16
parameters:
  density: 1.0
  int8_mask: true
  normalize: true
  rescale: false
  weight: 1.0
slices:
- sources:
  - layer_range:
    - 0
    - 2
    model: merges/Qwentessential-14B-slerp
- sources:
  - layer_range:
    - 2
    - 6
    model: merges/Qwentessential-14B-slerp
- sources:
  - layer_range:
    - 6
    - 10
    model: merges/Qwentessential-14B-slerp
- sources:
  - layer_range:
    - 10
    - 14
    model: merges/Qwentessential-14B-slerp
- sources:
  - layer_range:
    - 14
    - 18
    model: merges/Qwentessential-14B-slerp
- sources:
  - layer_range:
    - 18
    - 22
    model: merges/Qwentessential-14B-slerp
- sources:
  - layer_range:
    - 22
    - 26
    model: merges/Qwentessential-14B-slerp
- sources:
  - layer_range:
    - 26
    - 30
    model: merges/Qwentessential-14B-slerp
- sources:
  - layer_range:
    - 30
    - 34
    model: merges/Qwentessential-14B-slerp
- sources:
  - layer_range:
    - 34
    - 38
    model: merges/Qwentessential-14B-slerp
- sources:
  - layer_range:
    - 38
    - 42
    model: merges/Qwentessential-14B-slerp
- sources:
  - layer_range:
    - 42
    - 46
    model: merges/Qwentessential-14B-slerp
- sources:
  - layer_range:
    - 46
    - 48
    model: merges/Qwentessential-14B-slerp
tokenizer_source: base
```
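The merged checkpoint loads like any other Qwen2.5 model. Below is a minimal sketch using transformers, assuming the stock Qwen2.5 chat template; loading in bfloat16 mirrors the `out_dtype` in the configuration above.

```python
# Sketch: loading sometimesanotion/Qwentessential-14B-v3 with transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sometimesanotion/Qwentessential-14B-v3"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the merge is exported in bfloat16 (out_dtype)
    device_map="auto",
)

messages = [{"role": "user", "content": "Summarize what a TIES merge does."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```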