frqwen2.5-from72b-duable10layers / mergekit_config.yml
slices:
- sources:
  - layer_range: [0, 40]   # first 40 layers of the base model
    model: Qwen/Qwen2.5-72B-Instruct
- sources:
  - layer_range: [30, 40]  # layers 30-40 repeated, taken from Athene-V2-Chat
    model: Nexusflow/Athene-V2-Chat
- sources:
  - layer_range: [40, 80]  # remaining 40 layers of the base model
    model: Qwen/Qwen2.5-72B-Instruct
merge_method: passthrough  # stack the slices verbatim; no weight blending
dtype: bfloat16
tokenizer_source: "Qwen/Qwen2.5-72B-Instruct"
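
This is a passthrough "frankenmerge": the slices are concatenated in the order listed, with no weight averaging. The result has 40 + 10 + 40 = 90 layers instead of Qwen2.5-72B-Instruct's original 80, because layers 30-40 appear twice, the second copy coming from Nexusflow/Athene-V2-Chat (itself a Qwen2.5-72B finetune, which is why the layer shapes line up). That duplicated block is presumably the "duable10layers" (double 10 layers) in the repo name. Below is a minimal sketch of running this config through mergekit's Python entry point; the output path is illustrative, and the MergeOptions flags are conservative defaults rather than anything this repo specifies:

import torch
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the YAML above into mergekit's config model.
with open("mergekit_config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Stack the slices and write the merged 90-layer model to disk.
# out_path is illustrative; any writable directory works.
run_merge(
    merge_config,
    out_path="./frqwen2.5-from72b-duable10layers",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use a GPU if one is present
        copy_tokenizer=True,             # copy tokenizer files into the output
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)

The same merge can also be produced with the CLI (mergekit-yaml mergekit_config.yml ./output --cuda); the Python route is shown only because it makes the config-to-model step explicit.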