---
base_model:
- ABX-AI/Cerebral-Infinity-7B
- ABX-AI/Spicy-Laymonade-7B
library_name: transformers
tags:
- mergekit
- merge
- not-for-all-audiences
---

# Cosmic-Citrus-9B
Another attempt at merging Cerebrum, InfinityRP, LemonadeRP, and Laymonade (all combined in my previous merges), this time into a 9B that also contains TheSpice.
So far in my tests, it follows my cards in intriguing ways, using refined language and paying closer attention to what the prompt is saying.
In fact, I'm quite positively surprised, as the creativity surpassed my expectations. It's quickly becoming a favorite of mine.
## Merge Details
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
### Merge Method
This model was merged using the passthrough merge method.
### Models Merged
The following models were included in the merge:
* [ABX-AI/Cerebral-Infinity-7B](https://huggingface.co/ABX-AI/Cerebral-Infinity-7B)
* [ABX-AI/Spicy-Laymonade-7B](https://huggingface.co/ABX-AI/Spicy-Laymonade-7B)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
slices:
  - sources:
      - model: ABX-AI/Cerebral-Infinity-7B
        layer_range: [0, 20]
  - sources:
      - model: ABX-AI/Spicy-Laymonade-7B
        layer_range: [12, 32]
merge_method: passthrough
dtype: float16
```
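As a rough illustration of how two 7B models yield a ~9B result, here is a minimal sketch of the layer arithmetic behind this passthrough merge (not mergekit itself), assuming both sources are standard 32-layer Mistral-7B-class models:

```python
# Sketch of the passthrough layer math; the slices mirror the YAML above.
# Assumption: both donors are 32-layer Mistral-7B-style models.
slices = [
    ("ABX-AI/Cerebral-Infinity-7B", (0, 20)),   # layers 0-19
    ("ABX-AI/Spicy-Laymonade-7B", (12, 32)),    # layers 12-31
]

# Passthrough simply stacks the selected layer ranges end-to-end.
total_layers = sum(end - start for _, (start, end) in slices)
print(total_layers)  # 40

# The [12, 20) region is contributed once by each donor, so the merged
# stack is deeper than either source, pushing the parameter count to ~9B.
```

Note that the two ranges deliberately overlap: duplicating the middle layers is a common way to deepen a frankenmerge without discarding either model's early or late layers.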