---
base_model:
- mistralai/Mistral-7B-Instruct-v0.2
- Nexusflow/Starling-LM-7B-beta
library_name: transformers
tags:
- mergekit
- merge
- not-for-all-audiences
---
![](maid.jpeg)

# Mistralv0.2-StarlingLM-FrankenMaid-10.5B

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged in two stages. First, a [DARE](https://arxiv.org/abs/2311.03099)-[TIES](https://arxiv.org/abs/2306.01708) (`dare_ties`) merge of [mistralai/Mistral-7B-Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2) with Franken-Maid produced the intermediate model `fM-v2`. A passthrough merge then stacked layer slices from `fM-v2`, [Nexusflow/Starling-LM-7B-beta](https://huggingface.co/Nexusflow/Starling-LM-7B-beta), and Mistral-7B-Instruct-v0.2. The passthrough stage assembles 47 decoder layers (16 + 16 + 15) against the 32 layers of the 7B base, which is what brings the parameter count to roughly 10.5B.

### Models Merged

The following models were included in the merge:

* [mistralai/Mistral-7B-Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2)
* [Nexusflow/Starling-LM-7B-beta](https://huggingface.co/Nexusflow/Starling-LM-7B-beta)
* fM-v2 (an intermediate `dare_ties` merge; see the first configuration block below)

### Configuration

The model works well with the Alpaca instruct template (an example prompt is shown under Usage below).

The following YAML configurations were used to produce this model, in execution order:

```yaml
# Stage 1: dare_ties merge producing the intermediate model fM-v2
models:
  - model: mistralai/Mistral-7B-Instruct-v0.2
  - model: Franken-Maid
    parameters:
      weight: 0.25
      density: 1.0
  - model: Franken-Maid
    parameters:
      weight: 0.30
      density: 1.0
merge_method: dare_ties
base_model: mistralai/Mistral-7B-Instruct-v0.2
parameters:
  int8_mask: true
dtype: bfloat16
name: fM-v2
```

```yaml
# Stage 2: passthrough frankenmerge producing the final 10.5B model
slices:
  - sources:
      - model: "fM-v2"
        layer_range: [0, 16]
  - sources:
      - model: "Nexusflow/Starling-LM-7B-beta"
        layer_range: [8, 24]
  - sources:
      - model: "mistralai/Mistral-7B-Instruct-v0.2"
        layer_range: [17, 32]
merge_method: passthrough
dtype: float16
name: fM-v2
```
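For reference, below is a minimal sketch of driving mergekit from Python rather than through its `mergekit-yaml` command-line entry point. The config filename and output path are placeholders, and the stage-1 merge must be run first so that the `fM-v2` model exists before the stage-2 passthrough config references it.

```python
# Minimal sketch of running a mergekit config from Python; equivalent to
# `mergekit-yaml <config> <out_path>`. Filenames and paths are placeholders.
import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Stage 1 (dare_ties): produces the intermediate model consumed by stage 2.
with open("stage1-dare-ties.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./fM-v2",  # referenced by the passthrough config as "fM-v2"
    options=MergeOptions(
        cuda=torch.cuda.is_available(),
        copy_tokenizer=True,
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
# Repeat with the stage-2 passthrough config to build the final 10.5B model.
```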
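## Usage

A minimal inference sketch with `transformers`, prompting the model in the Alpaca instruct format recommended above. The repository id is a placeholder for wherever the merged weights are hosted.

```python
# Minimal inference sketch; the repo id below is a placeholder.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/Mistralv0.2-StarlingLM-FrankenMaid-10.5B"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# Alpaca instruct template, as recommended in this card.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nWrite a short poem about the sea.\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(
    **inputs, max_new_tokens=256, do_sample=True, temperature=0.8
)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```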