---
license: llama2
pipeline_tag: text-generation
---

# BigLiz 120B

A Goliath-120b-style frankenmerge of lzlv-70b and WinterGoddess-1.4x-70b.

# Prompting Format

Vicuna and Alpaca.

# Merge process

The models used in the merge are [lzlv-70b](https://huggingface.co/lizpreciatior/lzlv_70b_fp16_hf) and [WinterGoddess-1.4x-70b](https://huggingface.co/Sao10K/WinterGoddess-1.4x-70B-L2).

```yaml
slices:
  - sources:
      - model: lizpreciatior_lzlv_70b_fp16_hf
        layer_range: [0, 16]
  - sources:
      - model: Sao10K_WinterGoddess-1.4x-70B-L2
        layer_range: [8, 24]
  - sources:
      - model: lizpreciatior_lzlv_70b_fp16_hf
        layer_range: [17, 32]
  - sources:
      - model: Sao10K_WinterGoddess-1.4x-70B-L2
        layer_range: [25, 40]
  - sources:
      - model: lizpreciatior_lzlv_70b_fp16_hf
        layer_range: [33, 48]
  - sources:
      - model: Sao10K_WinterGoddess-1.4x-70B-L2
        layer_range: [41, 56]
  - sources:
      - model: lizpreciatior_lzlv_70b_fp16_hf
        layer_range: [49, 64]
  - sources:
      - model: Sao10K_WinterGoddess-1.4x-70B-L2
        layer_range: [57, 72]
  - sources:
      - model: lizpreciatior_lzlv_70b_fp16_hf
        layer_range: [65, 80]
merge_method: passthrough
dtype: float16
```

# Acknowledgements

[@lizpreciatior](https://huggingface.co/lizpreciatior) for creating lzlv.

[@Sao10K](https://huggingface.co/Sao10K) for creating WinterGoddess.

[@alpindale](https://huggingface.co/alpindale) for creating the original Goliath.

[@chargoddard](https://huggingface.co/chargoddard) for developing [mergekit](https://github.com/cg123/mergekit).
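
For reference, the two supported prompt templates typically look like the following. This is a sketch of the common Vicuna and Alpaca conventions, not something specified by this card; check the base models' cards for the exact system prompt and spacing they were tuned with.

```
Vicuna:
USER: {prompt}
ASSISTANT:

Alpaca:
### Instruction:
{prompt}

### Response:
```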
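
As a sanity check, the slice layout in the merge config can be tallied to confirm the merged model's depth. The sketch below assumes mergekit's `layer_range` is half-open (`[start, end)`, like a Python slice), which yields the 137-layer depth of the Goliath-120b recipe this merge mirrors:

```python
# Slice layout copied from the merge config above: (model, start, end).
# Assumes mergekit's half-open layer_range convention, [start, end).
slices = [
    ("lzlv", 0, 16),
    ("WinterGoddess", 8, 24),
    ("lzlv", 17, 32),
    ("WinterGoddess", 25, 40),
    ("lzlv", 33, 48),
    ("WinterGoddess", 41, 56),
    ("lzlv", 49, 64),
    ("WinterGoddess", 57, 72),
    ("lzlv", 65, 80),
]

# Total depth of the merged model is the sum of the slice widths.
total_layers = sum(end - start for _, start, end in slices)
print(total_layers)  # 137
```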