---
license: apache-2.0
tags:
- merge
- mergekit
- ties
- "7B"
- "eren23/OGNO-7b-dpo-truthful"
- "Kquant03/NeuralTrix-7B-dpo-laser"
---
# FMixIA-7B-TIES-1

A merged model created with the TIES method (Trim, Elect Sign & Merge) using [mergekit](https://github.com/cg123/mergekit).
## Model Details

- **Base Models**:
  * [eren23/OGNO-7b-dpo-truthful](https://huggingface.co/eren23/OGNO-7b-dpo-truthful)
  * [Kquant03/NeuralTrix-7B-dpo-laser](https://huggingface.co/Kquant03/NeuralTrix-7B-dpo-laser)
- **Merge Method**: ties (see the sketch after this list)
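
For intuition, here is a minimal sketch of the three TIES steps (trim, elect sign, disjoint merge) applied to toy weight tensors. This is illustrative only, not mergekit's implementation: the `trim` and `ties_merge` helpers are hypothetical, and the single global `weight` is a simplification of mergekit's per-model weights.

```python
import torch

def trim(delta: torch.Tensor, density: float) -> torch.Tensor:
    """Keep the top-`density` fraction of entries by magnitude; zero the rest."""
    k = max(1, int(density * delta.numel()))
    threshold = delta.abs().flatten().topk(k).values.min()
    return torch.where(delta.abs() >= threshold, delta, torch.zeros_like(delta))

def ties_merge(base: torch.Tensor, tuned: list[torch.Tensor],
               density: float = 0.5, weight: float = 0.5) -> torch.Tensor:
    # 1. Trim: sparsify each task vector (fine-tuned weights minus base).
    deltas = torch.stack([trim(t - base, density) for t in tuned])
    # 2. Elect sign: per-parameter majority sign, weighted by magnitude.
    sign = torch.sign(deltas.sum(dim=0))
    # 3. Disjoint merge: average only the entries that agree with the elected sign.
    agree = (torch.sign(deltas) == sign) & (deltas != 0)
    merged = (deltas * agree).sum(dim=0) / agree.sum(dim=0).clamp(min=1)
    return base + weight * merged
```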
## Configuration

```yaml
models:
  - model: eren23/OGNO-7b-dpo-truthful
  - model: Kquant03/NeuralTrix-7B-dpo-laser
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: eren23/OGNO-7b-dpo-truthful
parameters:
  normalize: true
dtype: float16
```
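
Assuming `mergekit` is installed (`pip install mergekit`), a configuration like the one above can be applied with its `mergekit-yaml` command, e.g. `mergekit-yaml config.yml ./FMixIA-7B-TIES-1` (the config filename and output path here are illustrative).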
## Usage

This model can be used with the standard `transformers` library:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("Ro-xe/FMixIA-7B-TIES-1")
tokenizer = AutoTokenizer.from_pretrained("Ro-xe/FMixIA-7B-TIES-1")
```
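
For completeness, a minimal generation example; the prompt and sampling settings are illustrative, and `device_map="auto"` requires the `accelerate` package:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained(
    "Ro-xe/FMixIA-7B-TIES-1",
    torch_dtype=torch.float16,  # matches the merge dtype above
    device_map="auto",          # place layers on available GPU(s)
)
tokenizer = AutoTokenizer.from_pretrained("Ro-xe/FMixIA-7B-TIES-1")

prompt = "Explain model merging in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```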