---
license: cc-by-nc-4.0
language:
- en
tags:
- solar
- rotating-stack-merge
---
# Skunk Ape 16b
![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/65a531bc7ec6af0f95c707b1/p9tbuezkb2qvf8kWEnO_2.jpeg)
This model is a rotating-stack merge of three Solar-based models in a 16b (72-layer) configuration. The result of
this "frankenmerge" is a medium-sized model that contains what I consider to be the best of the Solar finetunes.
Mergefuel:
- Sao10K/Fimbulvetr-11B-v2
- Sao10K/Solstice-11B-v1
- TheDrummer/Moistral-11B-v1
This model is uncensored and capable of generating objectionable material. However, it is not an explicitly NSFW model,
and in my experience it has never "gone rogue" and tried to insert NSFW content into SFW prompts. As with any LLM, no
factual claims made by the model should be taken at face value. You know that boilerplate safety disclaimer that most
professional models have? Assume this has it too. This model is for entertainment purposes only.
iMatrix GGUFs:
# Sample output
```
{{[INPUT]}}
Write a detailed and humorous story about a cute and fluffy bunny that goes to a Gwar concert.
{{[OUTPUT]}}
```
# Prompt format
Prefers Alpaca.
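For reference, the standard Alpaca template looks like this (the preamble line is commonly included but optional):
```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{prompt}

### Response:
```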
# WTF is a rotating-stack merge?
Inspired by Undi's experiments with stacked merges, Jeb Carter found that output quality and model initiative could be significantly
improved by reversing the model order in the stack, and then doing a linear merge between the original and reversed stacks. That is
what I did here: I created three passthrough stacked merges from the three source models (rotating the model order in each stack),
then did a linear merge of all three stacks. The exact merge configs can be found in the recipe.txt file.
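To illustrate the idea in mergekit-style YAML (the layer ranges and stack paths below are placeholders I made up for the sketch, not the actual recipe; see recipe.txt for the real configs), one of the three passthrough stacks might look like:
```
# Hypothetical stack A -- stacks B and C would rotate the model order.
# Layer ranges are illustrative only; the real ones are in recipe.txt.
slices:
  - sources:
      - model: Sao10K/Fimbulvetr-11B-v2
        layer_range: [0, 24]
  - sources:
      - model: Sao10K/Solstice-11B-v1
        layer_range: [12, 36]
  - sources:
      - model: TheDrummer/Moistral-11B-v1
        layer_range: [24, 48]
merge_method: passthrough
dtype: float16
```
The final step then linear-merges the three stacks; an equal-weight version of that config would be something like:
```
# Hypothetical final step: equal-weight linear merge of the three stacks.
models:
  - model: ./stack-a
    parameters:
      weight: 1.0
  - model: ./stack-b
    parameters:
      weight: 1.0
  - model: ./stack-c
    parameters:
      weight: 1.0
merge_method: linear
dtype: float16
```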