---
license: apache-2.0
tags:
- merge
- mergekit
- vortexmergekit
- OEvortex/HelpingAI2-6B
---
# 4x1b
Hey there! Welcome to the 4x1b! This is a merge of multiple models brought together using the awesome [VortexMerge kit](https://colab.research.google.com/drive/1YjcvCLuNG1PK7Le6_4xhVU5VpzTwvGhk#scrollTo=UG5H2TK4gVyl).
Let's see what we've got in this merge:
* [OEvortex/HelpingAI2-6B](https://huggingface.co/OEvortex/HelpingAI2-6B)
* [OEvortex/HelpingAI2-6B](https://huggingface.co/OEvortex/HelpingAI2-6B)
* [OEvortex/HelpingAI2-6B](https://huggingface.co/OEvortex/HelpingAI2-6B)
* [OEvortex/HelpingAI2-6B](https://huggingface.co/OEvortex/HelpingAI2-6B)
## 🧩 Configuration
```yaml
dtype: float16
merge_method: passthrough
slices:
  - sources:
      - layer_range: [0, 8]
        model: OEvortex/HelpingAI2-6B
  - sources:
      - layer_range: [4, 12]
        model: OEvortex/HelpingAI2-6B
  - sources:
      - layer_range: [8, 16]
        model: OEvortex/HelpingAI2-6B
  - sources:
      - layer_range: [12, 21]
        model: OEvortex/HelpingAI2-6B
```
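
The `passthrough` method above does no weight averaging: it simply stacks the listed layer ranges from the base model into one deeper network, with adjacent slices deliberately overlapping so some blocks appear twice. As a rough sketch (assuming, as is conventional for mergekit slices, that `layer_range: [start, end]` is half-open, contributing `end - start` layers), you can work out the depth of the resulting model from the config alone:

```python
# Layer slices taken from the config above.
# Assumption: half-open [start, end) ranges, as mergekit passthrough slices use.
slices = [(0, 8), (4, 12), (8, 16), (12, 21)]

# Each slice contributes (end - start) layers to the stacked model.
total_layers = sum(end - start for start, end in slices)
print(total_layers)  # 33 layers in the merged model

# Consecutive slices overlap, so those layers are duplicated in the stack.
overlaps = [min(a_end, b_end) - max(a_start, b_start)
            for (a_start, a_end), (b_start, b_end) in zip(slices, slices[1:])]
print(overlaps)  # [4, 4, 4] -> each neighbouring pair repeats 4 layers
```

So the merge turns a 21-layer range of HelpingAI2-6B into a 33-layer frankenmerge, with three 4-layer overlap regions duplicated.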