---
license: llama2
pipeline_tag: text-generation
---
# BigLiz 120B
A Goliath-120b style frankenmerge of lzlv-70b and WinterGoddess-1.4x-70b.
## Prompting Format
Both Vicuna and Alpaca prompt formats work.
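For reference, the standard templates look roughly like this; `{prompt}` is a placeholder, and the exact system prompt and spacing are flexible.

Vicuna:

```
A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions.

USER: {prompt}
ASSISTANT:
```

Alpaca:

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{prompt}

### Response:
```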
## Merge process
The models used in the merge are lzlv-70b and WinterGoddess-1.4x-70b, interleaved with the following mergekit configuration:
```yaml
slices:
  - sources:
      - model: lizpreciatior_lzlv_70b_fp16_hf
        layer_range: [0, 16]
  - sources:
      - model: Sao10K_WinterGoddess-1.4x-70B-L2
        layer_range: [8, 24]
  - sources:
      - model: lizpreciatior_lzlv_70b_fp16_hf
        layer_range: [17, 32]
  - sources:
      - model: Sao10K_WinterGoddess-1.4x-70B-L2
        layer_range: [25, 40]
  - sources:
      - model: lizpreciatior_lzlv_70b_fp16_hf
        layer_range: [33, 48]
  - sources:
      - model: Sao10K_WinterGoddess-1.4x-70B-L2
        layer_range: [41, 56]
  - sources:
      - model: lizpreciatior_lzlv_70b_fp16_hf
        layer_range: [49, 64]
  - sources:
      - model: Sao10K_WinterGoddess-1.4x-70B-L2
        layer_range: [57, 72]
  - sources:
      - model: lizpreciatior_lzlv_70b_fp16_hf
        layer_range: [65, 80]
merge_method: passthrough
dtype: float16
```
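A merge like this can be reproduced by saving the config above to a file and running it through mergekit's `mergekit-yaml` CLI. The file name and output directory below are placeholders; local copies of both source models are required.

```sh
# Placeholder paths; adjust to your local config file and desired output directory.
mergekit-yaml bigliz-120b.yml ./BigLiz-120B --cuda
```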
## Acknowledgements
- @lizpreciatior for creating lzlv
- @Sao10K for creating WinterGoddess
- @alpindale for creating the original Goliath
- @chargoddard for developing mergekit