---
base_model:
- NeverSleep/Lumimaid-v0.2-12B
- intervitens/mini-magnum-12b-v1.1
- nbeerbower/mistral-nemo-gutenberg-12B-v4
- natong19/Mistral-Nemo-Instruct-2407-abliterated
- Lambent/arsenic-nemo-unleashed-12B
library_name: transformers
tags:
- mergekit
- merge
---
![image/png](https://huggingface.co/yamatazen/Amelia-SCE-12B/resolve/main/Amelia-SCE-12B.png?download=true)

This model was created without merging any ChatML models.

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [SCE](https://arxiv.org/abs/2408.07990) merge method, with [natong19/Mistral-Nemo-Instruct-2407-abliterated](https://huggingface.co/natong19/Mistral-Nemo-Instruct-2407-abliterated) as the base.

### Models Merged

The following models were included in the merge:

* [NeverSleep/Lumimaid-v0.2-12B](https://huggingface.co/NeverSleep/Lumimaid-v0.2-12B)
* [intervitens/mini-magnum-12b-v1.1](https://huggingface.co/intervitens/mini-magnum-12b-v1.1)
* [nbeerbower/mistral-nemo-gutenberg-12B-v4](https://huggingface.co/nbeerbower/mistral-nemo-gutenberg-12B-v4)
* [Lambent/arsenic-nemo-unleashed-12B](https://huggingface.co/Lambent/arsenic-nemo-unleashed-12B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: natong19/Mistral-Nemo-Instruct-2407-abliterated
models:
  - model: Lambent/arsenic-nemo-unleashed-12B
  - model: nbeerbower/mistral-nemo-gutenberg-12B-v4
  - model: NeverSleep/Lumimaid-v0.2-12B
  - model: intervitens/mini-magnum-12b-v1.1
merge_method: sce
dtype: bfloat16
parameters:
  normalize: true
  select_topk: 0.5
```
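As an intuition for the `select_topk: 0.5` parameter above, the following is a minimal illustrative sketch (not mergekit's actual SCE implementation) of one SCE-style step for a single tensor: the fine-tuned models' deltas from the base are compared element-wise, only the fraction of elements with the highest variance across models is kept, and the surviving averaged deltas are added back onto the base. The function name and exact selection rule here are assumptions for illustration only.

```python
import numpy as np

def sce_sketch(base, models, select_topk=0.5):
    """Illustrative sketch of an SCE-style merge for one tensor.

    base: base-model tensor; models: list of fine-tuned tensors of the
    same shape. Keeps only the top select_topk fraction of elements by
    variance across the task vectors (deltas from the base), then
    averages the surviving deltas back onto the base. This is a toy
    approximation, not mergekit's implementation.
    """
    deltas = np.stack([m - base for m in models])  # task vectors
    variance = deltas.var(axis=0)                  # per-element disagreement
    k = int(np.ceil(select_topk * variance.size))  # how many elements to keep
    thresh = np.sort(variance.ravel())[-k]         # variance cutoff for top-k
    mask = variance >= thresh                      # select high-variance elements
    merged_delta = deltas.mean(axis=0) * mask      # erase the rest
    return base + merged_delta
```

To reproduce a merge from the YAML configuration above, mergekit's `mergekit-yaml` entry point consumes such a file, e.g. `mergekit-yaml config.yaml ./output-model`.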