---
base_model:
- Sakalti/magro-7B
- HuggingFaceH4/zephyr-7b-alpha
library_name: transformers
tags:
- mergekit
- merge
inference: true
license: mit
---
# Magro-7b-v1.1
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [HuggingFaceH4/zephyr-7b-alpha](https://huggingface.co/HuggingFaceH4/zephyr-7b-alpha) as the base model. TIES reduces interference between merged models by trimming each fine-tuned model's parameter deltas to the highest-magnitude entries, electing a majority sign per parameter, and averaging only the deltas that agree with that sign; a toy sketch of the procedure follows.
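The snippet below is an illustrative, single-tensor sketch of the TIES steps as described in the paper, not mergekit's implementation; the `ties_merge` function name and the example tensors are made up for the illustration. With `density: 1`, as in the configuration further down, the trim step keeps every delta.

```python
# Toy single-tensor sketch of TIES (trim, elect sign, disjoint merge).
# Illustrative only -- this is not mergekit's code.
import torch

def ties_merge(base: torch.Tensor,
               finetuned: list[torch.Tensor],
               weights: list[float],
               density: float = 1.0) -> torch.Tensor:
    deltas = []
    for ft, w in zip(finetuned, weights):
        d = (ft - base) * w  # task vector, scaled by its merge weight
        k = int(density * d.numel())
        if k < d.numel():  # trim: keep the top-`density` fraction by magnitude
            threshold = d.abs().flatten().kthvalue(d.numel() - k).values
            d = torch.where(d.abs() > threshold, d, torch.zeros_like(d))
        deltas.append(d)
    stacked = torch.stack(deltas)
    elected = torch.sign(stacked.sum(dim=0))  # elect a majority sign per parameter
    agree = torch.sign(stacked) == elected    # keep only deltas that agree with it
    merged = (stacked * agree).sum(dim=0) / agree.sum(dim=0).clamp(min=1)
    return base + merged

# Example: merge two hypothetical fine-tuned tensors onto a base tensor.
base = torch.zeros(4, 4)
merged = ties_merge(base, [torch.randn(4, 4), torch.randn(4, 4)], [1.0, 1.0])
```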
### Models Merged
The following models were included in the merge:
* [Sakalti/magro-7B](https://huggingface.co/Sakalti/magro-7B)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: Sakalti/magro-7B
    parameters:
      weight: 1
      density: 1
merge_method: ties
base_model: HuggingFaceH4/zephyr-7b-alpha
parameters:
  weight: 1
  density: 1
  normalize: true
  int8_mask: true
dtype: bfloat16
```
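## Usage

The merge itself can be reproduced by passing the configuration above to mergekit's `mergekit-yaml` command (for example, `mergekit-yaml config.yaml ./merged`). Below is a minimal inference sketch using `transformers`; the repo id `Sakalti/Magro-7b-v1.1` is an assumption taken from this card's title, and the prompt is illustrative.

```python
# Minimal inference sketch for the merged model. The repo id is assumed
# from this card's title; point it at wherever the merged weights live.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Sakalti/Magro-7b-v1.1"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the `dtype` in the merge config
    device_map="auto",
)

inputs = tokenizer("Hello! Tell me about yourself.", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```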