---
base_model:
- prithivMLmods/Messier-Opus-14B-Elite7
- prithivMLmods/Equuleus-Opus-14B-Exp
- Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v3
- Sakalti/Saka-14B
- Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v7
- sometimesanotion/Lamarck-14B-v0.7-Fusion
- sometimesanotion/LamarckInfusion-14B-v1
- prithivMLmods/Sombrero-Opus-14B-Elite6
library_name: transformers
tags:
- mergekit
- merge
---

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [SCE](https://arxiv.org/abs/2408.07990) merge method, with [Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v7](https://huggingface.co/Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v7) as the base.

### Models Merged

The following models were included in the merge:

* [prithivMLmods/Messier-Opus-14B-Elite7](https://huggingface.co/prithivMLmods/Messier-Opus-14B-Elite7)
* [prithivMLmods/Equuleus-Opus-14B-Exp](https://huggingface.co/prithivMLmods/Equuleus-Opus-14B-Exp)
* [Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v3](https://huggingface.co/Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v3)
* [Sakalti/Saka-14B](https://huggingface.co/Sakalti/Saka-14B)
* [sometimesanotion/Lamarck-14B-v0.7-Fusion](https://huggingface.co/sometimesanotion/Lamarck-14B-v0.7-Fusion)
* [sometimesanotion/LamarckInfusion-14B-v1](https://huggingface.co/sometimesanotion/LamarckInfusion-14B-v1)
* [prithivMLmods/Sombrero-Opus-14B-Elite6](https://huggingface.co/prithivMLmods/Sombrero-Opus-14B-Elite6)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
name: NQLSG-Qwen2.5-14B-MegaFusion-v8
models:
  - model: Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v3
  - model: Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v7
  - model: prithivMLmods/Equuleus-Opus-14B-Exp
  - model: prithivMLmods/Messier-Opus-14B-Elite7
  - model: prithivMLmods/Sombrero-Opus-14B-Elite6
  - model: Sakalti/Saka-14B
  - model: sometimesanotion/Lamarck-14B-v0.7-Fusion
  - model: sometimesanotion/LamarckInfusion-14B-v1
base_model: Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v7
chat_template: auto
dtype: bfloat16
merge_method: sce
parameters:
  int8_mask: true
tokenizer:
  source: union
```
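The merged checkpoint loads like any other `transformers` causal language model. The sketch below assumes the weights are published under the `name` from the config above (i.e. `Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8`, an assumed repo id rather than one stated on this card); if you reproduce the merge locally with `mergekit-yaml config.yaml ./output-dir`, point `model_id` at that output directory instead.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id taken from the `name` field of the merge config;
# replace with a local mergekit output directory if you ran the merge yourself.
model_id = "Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the `dtype: bfloat16` used for the merge
    device_map="auto",
)

# `chat_template: auto` means the merged tokenizer carries a chat template,
# so prompts can be formatted with apply_chat_template.
messages = [{"role": "user", "content": "Summarize what model merging does in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```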