L3-8B-LingYang-v2 / mergekit_config.yml
slices:
  - sources:
      - model: "Sao10K/L3-8B-Stheno-v3.2+hfl/llama-3-chinese-8b-instruct-v2-lora"
        layer_range: [0, 32]
merge_method: passthrough
base_model: "gradientai/Llama-3-8B-Instruct-Gradient-1048k"
dtype: bfloat16
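A config like this is consumed by mergekit's `mergekit-yaml` command. A minimal invocation sketch, assuming mergekit is installed (`pip install mergekit`) and that the output directory name is illustrative:

```shell
# Run the merge described by the config above; writes the merged
# model to ./L3-8B-LingYang-v2 (output path is an example, not fixed).
mergekit-yaml mergekit_config.yml ./L3-8B-LingYang-v2
```

Note that the `+` in the `model:` value is mergekit's syntax for applying a LoRA adapter (here `hfl/llama-3-chinese-8b-instruct-v2-lora`) on top of the base checkpoint before merging.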