---
base_model:
- KidIkaros/Llama-3.2-1B-Instruct-abliterated
- Nexesenex/Llama_3.2_1b_Dolto_0.1
- Nexesenex/Llama_3.2_1b_OpenTree_R1_0.1
library_name: transformers
tags:
- mergekit
- merge
license: llama3.2
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, with [KidIkaros/Llama-3.2-1B-Instruct-abliterated](https://huggingface.co/KidIkaros/Llama-3.2-1B-Instruct-abliterated) as the base.

### Models Merged

The following models were included in the merge:
* [Nexesenex/Llama_3.2_1b_Dolto_0.1](https://huggingface.co/Nexesenex/Llama_3.2_1b_Dolto_0.1)
* [Nexesenex/Llama_3.2_1b_OpenTree_R1_0.1](https://huggingface.co/Nexesenex/Llama_3.2_1b_OpenTree_R1_0.1)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
merge_method: model_stock
models:
  - model: Nexesenex/Llama_3.2_1b_Dolto_0.1
    parameters:
      weight: 1.0
  - model: Nexesenex/Llama_3.2_1b_OpenTree_R1_0.1
    parameters:
      weight: 1.0
base_model: KidIkaros/Llama-3.2-1B-Instruct-abliterated
dtype: bfloat16
normalize: true
```
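
Since the front matter declares `library_name: transformers`, the merged checkpoint loads like any other Llama 3.2 instruct model. Below is a minimal sketch, assuming the merge output is available at a local path or Hub repository id (shown as a placeholder) and that `torch` and `accelerate` are installed for `device_map="auto"`.

```python
# Minimal loading sketch for the merged model with Hugging Face Transformers.
# "path/to/merged-model" is a placeholder: point it at the mergekit output
# directory or the Hub repository where this merge is published.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "path/to/merged-model"  # placeholder, not the actual repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used in the merge config
    device_map="auto",
)

# Llama 3.2 instruct models ship a chat template, so prompts can be built with it.
messages = [{"role": "user", "content": "Summarize what model merging is in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```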