---
base_model:
- jambroz/FNCARL-7b
- HuggingFaceH4/mistral-7b-anthropic
- jambroz/sixtyoneeighty-7b
- mlabonne/UltraMerge-7B
library_name: transformers
tags:
- mergekit
- merge
license: apache-2.0
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with [jambroz/sixtyoneeighty-7b](https://huggingface.co/jambroz/sixtyoneeighty-7b) as the base model.

### Models Merged

The following models were included in the merge:

* [jambroz/FNCARL-7b](https://huggingface.co/jambroz/FNCARL-7b)
* [HuggingFaceH4/mistral-7b-anthropic](https://huggingface.co/HuggingFaceH4/mistral-7b-anthropic)
* [mlabonne/UltraMerge-7B](https://huggingface.co/mlabonne/UltraMerge-7B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: jambroz/sixtyoneeighty-7b
dtype: bfloat16
merge_method: dare_ties
models:
- model: jambroz/sixtyoneeighty-7b
- model: mlabonne/UltraMerge-7B
  parameters:
    density: 0.53
    weight: 0.4
- model: HuggingFaceH4/mistral-7b-anthropic
  parameters:
    density: 0.53
    weight: 0.3
- model: jambroz/FNCARL-7b
  parameters:
    density: 0.53
    weight: 0.3
parameters:
  int8_mask: true
```
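For intuition, `dare_ties` first forms each donor model's task vector (its parameter delta from the base), randomly drops entries from that delta, and rescales the survivors so the delta's expected value is preserved; TIES-style sign election then resolves conflicts before the weighted deltas are added back onto the base. Below is a minimal sketch of the DARE step only, on illustrative tensors; it is not mergekit's actual implementation:

```python
import torch

def dare_sparsify(delta: torch.Tensor, density: float) -> torch.Tensor:
    """Randomly keep `density` of the delta's entries and rescale the
    survivors by 1/density so the expected delta is unchanged."""
    mask = torch.bernoulli(torch.full_like(delta, density))
    return delta * mask / density

# Illustrative weight tensors standing in for one layer of a real model.
base = torch.randn(4096)       # base-model weights
finetuned = torch.randn(4096)  # corresponding fine-tuned weights
delta = finetuned - base       # task vector

# density 0.53 and weight 0.4 match the UltraMerge-7B entry in the config above.
merged = base + 0.4 * dare_sparsify(delta, density=0.53)
```

At `density: 0.53`, roughly half of each delta's entries survive, which keeps the element-wise overlap between the three donor models sparse before the TIES sign-resolution step.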
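To reproduce the merge, save the YAML above (e.g. as `config.yaml`) and pass it to mergekit's CLI with `mergekit-yaml config.yaml ./merged-model`. The result then loads like any other Transformers checkpoint; the repo id below is a placeholder, not a published model name:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id -- substitute the actual Hub repo for this merge.
model_id = "your-username/your-merged-7b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

inputs = tokenizer("Hello, world!", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```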