---
library_name: transformers
tags:
- mergekit
- merge
---

I have no idea what I’m doing… if this causes the apocalypse, someone please let me know.

MN-12B-Starcannon-v3 8.0bpw h8 EXL2

Includes the [measurement.json](https://huggingface.co/FuturisticVibes/MN-12B-Starcannon-v3-8.0bpw-h8-exl2/tree/measurement) file for further quantization.

Original Model: https://huggingface.co/aetherwiing/MN-12B-Starcannon-v3

# Original Model Card

# Mistral Nemo 12B Starcannon v3

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

Arbitrary update, because I knew people would request it. Didn't have much time to test it, tbh, but it feels nice enough? It's up to y'all to decide whether it's an upgrade, sidegrade, or downgrade. At least now both models have ChatML trained, there's that.
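Since both ingredient models are ChatML-trained, ChatML prompting is the natural fit. Below is a minimal sketch of rendering such a prompt with `transformers`; it assumes the repo's tokenizer ships a ChatML chat template (check `tokenizer_config.json` if unsure), and the repo id is just an example pointing at the original, unquantized model.

```python
from transformers import AutoTokenizer

# Example repo id (the original, unquantized model); swap in a local path if needed.
tokenizer = AutoTokenizer.from_pretrained("aetherwiing/MN-12B-Starcannon-v3")

messages = [
    {"role": "system", "content": "You are a creative roleplay partner."},
    {"role": "user", "content": "Write a short scene aboard a derelict starship."},
]

# Render the ChatML-style prompt (<|im_start|>role ... <|im_end|>) as plain text,
# leaving the final turn open for the assistant's reply.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)
```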
[Static GGUF (by Mradermacher)](https://huggingface.co/mradermacher/MN-12B-Starcannon-v3-GGUF)
[Imatrix GGUF (by Mradermacher)](https://huggingface.co/mradermacher/MN-12B-Starcannon-v3-i1-GGUF)
[EXL2 (by kingbri of RoyalLab)](https://huggingface.co/royallab/MN-12B-Starcannon-v3-exl2)

## Merge Details

### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [nothingiisreal/MN-12B-Celeste-V1.9](https://huggingface.co/nothingiisreal/MN-12B-Celeste-V1.9) as the base.

### Merge Fodder

The following models were included in the merge:
* [anthracite-org/magnum-12b-v2](https://huggingface.co/anthracite-org/magnum-12b-v2)
* [nothingiisreal/MN-12B-Celeste-V1.9](https://huggingface.co/nothingiisreal/MN-12B-Celeste-V1.9)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: anthracite-org/magnum-12b-v2
    parameters:
      density: 0.3
      weight: 0.5
  - model: nothingiisreal/MN-12B-Celeste-V1.9
    parameters:
      density: 0.7
      weight: 0.5
merge_method: ties
base_model: nothingiisreal/MN-12B-Celeste-V1.9
parameters:
  normalize: true
  int8_mask: true
dtype: bfloat16
```
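If you want to reproduce the merge from this config, mergekit can be driven from its `mergekit-yaml` CLI or from Python. The snippet below is a sketch of the Python route, assuming the config above is saved as `starcannon-v3.yml` and that your installed mergekit version exposes the `run_merge`/`MergeOptions` API from its README; it is not the authors' exact invocation.

```python
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

CONFIG_YML = "starcannon-v3.yml"        # the YAML config shown above, saved to disk
OUTPUT_PATH = "./MN-12B-Starcannon-v3"  # where the merged weights will be written

# Parse and validate the merge configuration.
with open(CONFIG_YML, "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    OUTPUT_PATH,
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # run on GPU if one is present
        copy_tokenizer=True,             # copy the base model's tokenizer into the output
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```

The resulting folder is a regular `transformers`-loadable checkpoint, which can then be quantized further (e.g. to GGUF or EXL2, as in the links above).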