---
license: cc-by-nc-4.0
base_model:
- PocketDoc/Dans-PersonalityEngine-V1.1.0-12b
- inflatebot/MN-12B-Mag-Mell-R1
library_name: transformers
tags:
- mergekit
- merge
---

![GreenSnake](GreenSnake.img)

Version: [WhiteSnake](https://huggingface.co/DoppelReflEx/MN-12B-Mimicore-WhiteSnake) - [Orochi](https://huggingface.co/DoppelReflEx/MN-12B-Mimicore-Orochi) - [GreenSnake](#)

# What is it?

An earlier version of WhiteSnake; its Open LLM Leaderboard scores are not much different. It is not quite as good at producing 'human-like' responses, but it is still good enough. This merge model is a gift for Lunar New Year, haha. Enjoy it.

Good for RP, ERP, and storytelling.

# Chat Format?

ChatML, of course!
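ChatML wraps each turn in `<|im_start|>role ... <|im_end|>` markers. A minimal sketch of building such a prompt by hand (the `to_chatml` helper is illustrative, not part of any library):

```python
# Minimal ChatML prompt builder (illustrative helper, not a library API).
def to_chatml(messages, add_generation_prompt=True):
    """Render a list of {role, content} dicts as a ChatML prompt string."""
    prompt = "".join(
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n" for m in messages
    )
    if add_generation_prompt:
        # Leave an open assistant turn for the model to complete.
        prompt += "<|im_start|>assistant\n"
    return prompt

messages = [
    {"role": "system", "content": "You are a helpful roleplay assistant."},
    {"role": "user", "content": "Hello!"},
]
print(to_chatml(messages))
```

With `transformers`, `tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)` should produce an equivalent string, assuming the model's tokenizer ships a ChatML chat template.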
## Merge Details

### Models Merged

The following models were included in the merge:

* [PocketDoc/Dans-PersonalityEngine-V1.1.0-12b](https://huggingface.co/PocketDoc/Dans-PersonalityEngine-V1.1.0-12b)
* [inflatebot/MN-12B-Mag-Mell-R1](https://huggingface.co/inflatebot/MN-12B-Mag-Mell-R1)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: inflatebot/MN-12B-Mag-Mell-R1
  - model: PocketDoc/Dans-PersonalityEngine-V1.1.0-12b
merge_method: slerp
base_model: inflatebot/MN-12B-Mag-Mell-R1
parameters:
  t: [0.1, 0.2, 0.4, 0.6, 0.6, 0.4, 0.2, 0.1]
dtype: bfloat16
tokenizer_source: base
```
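The `slerp` method interpolates the two models' weights along a great-circle arc rather than a straight line, with the blend factor `t` varying across layer groups (small at the ends, 0.6 mid-stack, so the base model dominates the outer layers). A minimal sketch of spherical linear interpolation on plain vectors, assuming the usual convention that `t=0` returns the first input and `t=1` the second:

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation: t=0 -> v0, t=1 -> v1, with
    intermediate values following the arc between their directions."""
    n0 = math.sqrt(sum(x * x for x in v0))
    n1 = math.sqrt(sum(x * x for x in v1))
    cos_theta = sum(a * b for a, b in zip(v0, v1)) / (n0 * n1)
    cos_theta = max(-1.0, min(1.0, cos_theta))  # guard acos domain
    theta = math.acos(cos_theta)
    if theta < eps:
        # Nearly parallel vectors: fall back to linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]

# Halfway between two orthogonal unit vectors stays on the unit circle.
print(slerp(0.5, [1.0, 0.0], [0.0, 1.0]))  # -> [0.7071..., 0.7071...]
```

In the actual merge, mergekit applies this per tensor to the full bfloat16 weights; this toy version only shows the geometry behind the `t` schedule.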