---
base_model:
- hiig-piai/simba-v01c
- DRXD1000/Phoenix
- mistralai/Mistral-7B-v0.1
- OpenPipe/mistral-ft-optimized-1227
- VAGOsolutions/SauerkrautLM-7b-LaserChat
library_name: transformers
tags:
- mergekit
- merge
---
# VerwaltungsAnthologie_clear_7B

This model is used as an intermediate model for future merges. It is a merge of 4 pre-trained language models based on Mistral-7B-v0.1, created using [mergekit](https://github.com/cg123/mergekit).

In a second step, I combined it with [DiscoLM_German_7b_v1](https://huggingface.co/DiscoResearch/DiscoLM_German_7b_v1) to build the successor of 'talky_7B': [VerwaltungsAnthologie_Disco_7B](https://huggingface.co/MarcGrumpyOlejak/VerwaltungsAnthologie_Disco_7B).

## Merge Details

### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) as the base model.

### Models Merged

The following models were included in the merge:
* [hiig-piai/simba-v01c](https://huggingface.co/hiig-piai/simba-v01c)
* [DRXD1000/Phoenix](https://huggingface.co/DRXD1000/Phoenix)
* [OpenPipe/mistral-ft-optimized-1227](https://huggingface.co/OpenPipe/mistral-ft-optimized-1227)
* [VAGOsolutions/SauerkrautLM-7b-LaserChat](https://huggingface.co/VAGOsolutions/SauerkrautLM-7b-LaserChat)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
# works but never stops
models:
  - model: mistralai/Mistral-7B-v0.1
    # No parameters necessary for base model
  - model: VAGOsolutions/SauerkrautLM-7b-LaserChat
    parameters:
      density: 0.53
      weight: 0.15
  - model: hiig-piai/simba-v01c
    parameters:
      density: 0.53
      weight: 0.55
  - model: DRXD1000/Phoenix
    parameters:
      density: 0.53
      weight: 0.15
  - model: OpenPipe/mistral-ft-optimized-1227
    parameters:
      density: 0.53
      weight: 0.15
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  int8_mask: true
dtype: bfloat16
name: VerwaltungsAnthologie_clear_7B
```
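
Since the exact YAML is documented above, the merge can be reproduced programmatically. Below is a minimal sketch using mergekit's documented Python API; the config path, output directory, and option values are my assumptions, not part of the original card:

```python
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

CONFIG_YML = "config.yaml"                        # assumed: the YAML above, saved locally
OUTPUT_PATH = "./VerwaltungsAnthologie_clear_7B"  # assumed output directory

# Parse the merge configuration from the YAML file.
with open(CONFIG_YML, "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the DARE-TIES merge; the listed models are downloaded on first use.
run_merge(
    merge_config,
    out_path=OUTPUT_PATH,
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use a GPU if one is available
        copy_tokenizer=True,             # copy the base model's tokenizer into the output
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```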
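
## Usage

As an intermediate merge base, the model is mainly intended for further merging, but it can be loaded like any Mistral-7B checkpoint. Here is a minimal sketch with 🤗 Transformers; the repository id is assumed from the successor link above, and note the author's config comment ("works but never stops"), so generation may run on until `max_new_tokens` is reached:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MarcGrumpyOlejak/VerwaltungsAnthologie_clear_7B"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",           # requires the `accelerate` package
)

prompt = "Was ist ein Verwaltungsakt?"  # example German prompt ("What is an administrative act?")
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```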