---
license: cc-by-4.0
language:
- en
base_model:
- TeeZee/Orca-2-13b_flat
- NeverSleep/X-NoroChronos-13B
- NeverSleep/Noromaid-13b-v0.3
- KatyTheCutie/EstopianMaid-13B
- Undi95/MLewdBoros-L2-13B
- KoboldAI/LLaMA2-13B-Psyfighter2
- KoboldAI/LLaMA2-13B-Erebus-v3
library_name: transformers
tags:
- storywriting
- text adventure
- creative
- story
- writing
- fiction
- roleplaying
- rp
- mergekit
- merge
---

![pic](https://huggingface.co/FallenMerick/Bionic-Cetacean-20B/resolve/main/Bionic-Cetacean.jpg)

# Bionic-Cetacean-20B

In the same vein as the legendary [Psyonic-Cetacean-20B](https://huggingface.co/jebcarter/psyonic-cetacean-20B), I have attempted to create a 20B model that is equal parts creative and chaotic, while still remaining coherent enough for roleplaying purposes.
The three components used to create [Bionic-Vaquita-13B](https://huggingface.co/FallenMerick/Bionic-Vaquita-13B) were also used to create this stack.
Creativity and coherence were the primary focus of the late-stage manual testing that led to selecting this model.
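
For reference, here is a minimal loading sketch using 🤗 Transformers. The prompt and sampling settings are illustrative assumptions, not tested recommendations for this model:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the merged model in bfloat16, matching the dtype used in the merge config.
model_id = "FallenMerick/Bionic-Cetacean-20B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Illustrative roleplay-style prompt and sampling parameters.
prompt = "The airlock hissed open, and the ship's AI finally spoke:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.9)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```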

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

GGUF Quants:

* https://huggingface.co/backyardai/Bionic-Cetacean-20B-GGUF
* https://huggingface.co/mradermacher/Bionic-Cetacean-20B-GGUF
* https://huggingface.co/mradermacher/Bionic-Cetacean-20B-i1-GGUF

## Merge Details

### Merge Method

This model was merged using the passthrough merge method.

### Models Merged

The following models were included in the merge:

* [FallenMerick/Psyfighter2-Orca2-Erebus3](https://huggingface.co/FallenMerick/Psyfighter2-Orca2-Erebus3-13B)
* [FallenMerick/XNoroChronos-Orca2-Noromaid](https://huggingface.co/FallenMerick/XNoroChronos-Orca2-Noromaid-13B)
* [FallenMerick/EstopianMaid-Orca2-MlewdBoros](https://huggingface.co/FallenMerick/EstopianMaid-Orca2-MlewdBoros-13B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
  - sources:
      - model: FallenMerick/Psyfighter2-Orca2-Erebus3
        layer_range: [0, 13]
  - sources:
      - model: FallenMerick/XNoroChronos-Orca2-Noromaid
        layer_range: [8, 26]
  - sources:
      - model: FallenMerick/EstopianMaid-Orca2-MlewdBoros
        layer_range: [14, 32]
  - sources:
      - model: FallenMerick/Psyfighter2-Orca2-Erebus3
        layer_range: [27, 40]
merge_method: passthrough
dtype: bfloat16
```
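
Note that the passthrough stack concatenates 13 + 18 + 18 + 13 = 62 decoder layers drawn from the 40-layer 13B donor models, which is roughly where the ~20B parameter count comes from. The merge can be reproduced with mergekit itself; below is a minimal sketch using its Python API, where the config path and output directory are assumptions (the `mergekit-yaml` CLI is an equivalent alternative):

```python
# Sketch of reproducing the merge with mergekit's Python API.
# Equivalent CLI: mergekit-yaml config.yml ./Bionic-Cetacean-20B
import yaml
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the YAML configuration shown above (assumed saved as config.yml).
with open("config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the passthrough merge and write the stacked model to disk.
run_merge(
    merge_config,
    out_path="./Bionic-Cetacean-20B",
    options=MergeOptions(cuda=False, copy_tokenizer=True, lazy_unpickle=True),
)
```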