---
license: cc-by-4.0
language:
- en
base_model:
- Undi95/BigL-7B
- saishf/Multi-Verse-RP-7B
- KatyTheCutie/LemonadeRP-4.5.3
- icefog72/IceLemonTeaRP-32k-7b
- SanjiWatsuki/Kunoichi-DPO-v2-7B
library_name: transformers
tags:
- mergekit
- merge
- mistral
- text-generation
- roleplay
---
# Iced Lemon Cookie

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

GGUF quants: https://huggingface.co/FaradayDotDev/Iced-Lemon-Cookie-7B-GGUF

## Merge Details

### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [saishf/Multi-Verse-RP-7B](https://huggingface.co/saishf/Multi-Verse-RP-7B) as the base model.

### Models Merged

The following models were included in the merge:

* [Undi95/BigL-7B](https://huggingface.co/Undi95/BigL-7B)
* [KatyTheCutie/LemonadeRP-4.5.3](https://huggingface.co/KatyTheCutie/LemonadeRP-4.5.3)
* [icefog72/IceLemonTeaRP-32k-7b](https://huggingface.co/icefog72/IceLemonTeaRP-32k-7b)
* [SanjiWatsuki/Kunoichi-DPO-v2-7B](https://huggingface.co/SanjiWatsuki/Kunoichi-DPO-v2-7B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: icefog72/IceLemonTeaRP-32k-7b
    parameters:
      density: 1.0
      weight: 1.0
  - model: Undi95/BigL-7B
    parameters:
      density: 0.4
      weight: 1.0
  - model: SanjiWatsuki/Kunoichi-DPO-v2-7B
    parameters:
      density: 0.6
      weight: 1.0
  - model: KatyTheCutie/LemonadeRP-4.5.3
    parameters:
      density: 0.8
      weight: 1.0
merge_method: ties
base_model: saishf/Multi-Verse-RP-7B
parameters:
  normalize: true
dtype: float16
```
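
In this config, each `density` controls how much of a model's task vector (its parameter delta from the base model) survives TIES trimming, `weight` scales that model's contribution, and `normalize: true` divides the merged delta by the total weight of the models that actually contribute at each parameter. Assuming a standard mergekit install, a file like this would typically be passed to the `mergekit-yaml` entry point to produce the merged weights. The sketch below illustrates the TIES procedure (trim, majority-sign election, sign-consistent averaging) for a single tensor; it is a conceptual illustration only, and the `ties_merge` helper is not part of mergekit's API.

```python
import torch

def ties_merge(base: torch.Tensor, finetuned: list[torch.Tensor],
               densities: list[float], weights: list[float],
               normalize: bool = True) -> torch.Tensor:
    """Conceptual TIES merge for a single parameter tensor (illustrative, not mergekit's code)."""
    trimmed = []
    for ft, density, weight in zip(finetuned, densities, weights):
        tau = ft - base                                   # task vector: delta from the base model
        k = max(1, int(density * tau.numel()))            # number of entries to keep
        threshold = tau.abs().flatten().kthvalue(tau.numel() - k + 1).values
        mask = tau.abs() >= threshold                     # keep the top-`density` fraction by magnitude
        trimmed.append(weight * tau * mask)
    stacked = torch.stack(trimmed)
    elected_sign = torch.sign(stacked.sum(dim=0))         # per-parameter majority-sign election
    agrees = torch.sign(stacked) == elected_sign          # only sign-consistent entries are merged
    merged = (stacked * agrees).sum(dim=0)
    if normalize:                                         # divide by the weight that actually contributed
        w = torch.tensor(weights).view(-1, *[1] * base.dim())
        merged = merged / (agrees.float() * w).sum(dim=0).clamp(min=1e-8)
    return base + merged
```

With the densities above, for example, only 40% of BigL-7B's delta would be retained before sign election, while IceLemonTeaRP-32k-7b's delta is kept in full.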
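
Since the constituent models are Mistral-7B-based and the card lists `library_name: transformers`, the merge should load like any other causal LM. A minimal loading sketch follows; the repository id is a placeholder and the sampling settings are only illustrative.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "REPLACE-ME/Iced-Lemon-Cookie-7B"  # placeholder: substitute the actual repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # the merge was produced in float16 (see `dtype` above)
    device_map="auto",    # requires `accelerate`
)

prompt = "Write a short scene in which two rivals are forced to share an umbrella."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```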