models:
  - model: prithivMLmods/QwQ-LCoT2-7B-Instruct
  - model: prithivMLmods/QwQ-LCoT-7B-Instruct
merge_method: model_stock
base_model: Qwen/Qwen2.5-7B-Instruct
normalize: true
int8_mask: true
dtype: bfloat16
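This is a mergekit configuration using the model_stock merge method, with Qwen/Qwen2.5-7B-Instruct as the base model and the two QwQ-LCoT fine-tunes as donor models, merged in bfloat16. A minimal sketch of how a config like this is typically applied with mergekit's Python API (MergeConfiguration, MergeOptions, run_merge) follows; the file paths and option values are illustrative assumptions, not part of this repository:

```python
# Sketch: run the merge described by the YAML config above using mergekit.
# CONFIG_YML and OUTPUT_PATH are placeholder paths, not files in this repo.
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

CONFIG_YML = "mergekit_config.yml"   # the YAML shown above, saved to disk
OUTPUT_PATH = "./merged-model"       # directory for the merged weights

# Parse and validate the merge configuration.
with open(CONFIG_YML, "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Execute the merge and write the result (weights + tokenizer) to OUTPUT_PATH.
run_merge(
    merge_config,
    out_path=OUTPUT_PATH,
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use a GPU if one is available
        copy_tokenizer=True,             # copy the base model's tokenizer to the output
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```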