llama3.1-405B-upscaled-603B / mergekit_config.yml
giannisan · Upload folder using huggingface_hub · commit 0a6e3a9 (verified)
slices:
  - sources:
      - model: ../llama-3.1-405B
        layer_range: [0, 94]
  - sources:
      - model: ../llama-3.1-405B
        layer_range: [32, 126]
merge_method: passthrough
dtype: bfloat16
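As a rough sanity check on the "405B upscaled to 603B" naming, the passthrough merge above stacks two overlapping slices of the same base model. A minimal sketch, assuming mergekit's half-open `layer_range` convention (`[start, end)`) and that parameter count scales approximately linearly with transformer depth (embeddings and the LM head do not duplicate, so this is only an estimate):

```python
# The two slices from the config above (half-open [start, end) ranges).
slices = [(0, 94), (32, 126)]

merged_layers = sum(end - start for start, end in slices)  # 94 + 94 = 188
base_layers = 126       # Llama 3.1 405B transformer depth
base_params_b = 405     # approximate parameter count, in billions

# Rough size estimate: scale parameters by the ratio of layer counts.
est_params_b = base_params_b * merged_layers / base_layers
print(merged_layers)        # 188
print(round(est_params_b))  # ~604, in line with the "603B" in the repo name
```

Layers 32-93 appear twice in the merged stack; `passthrough` simply copies weights, so no interpolation happens between the duplicated blocks.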