Nemo-DPO-V23 / mergekit_config.yml
# Model Stock merge: combine two fine-tuned checkpoints with V22 as the base.
models:
  - model: ./Nemo-DPO-V20
  - model: ./Nemo-DPO-V21
merge_method: model_stock
base_model: ./Nemo-DPO-V22
parameters:
  normalize: false
  int8_mask: true
dtype: bfloat16  # store merged weights in bfloat16
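
A config like this is typically applied with mergekit's `mergekit-yaml` command-line tool. A minimal sketch, assuming mergekit is installed (`pip install mergekit`) and the three `Nemo-DPO-V2x` model directories exist locally; the output path `./Nemo-DPO-V23` is an assumption inferred from the repo name:

```shell
# Run the merge described by mergekit_config.yml and write the
# resulting model to ./Nemo-DPO-V23 (hypothetical output path).
mergekit-yaml mergekit_config.yml ./Nemo-DPO-V23
```

The merged folder can then be loaded like any Hugging Face model directory, e.g. with `transformers.AutoModelForCausalLM.from_pretrained("./Nemo-DPO-V23")`.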