---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- NousResearch/Llama-2-7b-chat-hf
- huggyllama/llama-7b
---
# demo

demo is a merge of the following models using [mergekit](https://github.com/cg123/mergekit):
* [NousResearch/Llama-2-7b-chat-hf](https://huggingface.co/NousResearch/Llama-2-7b-chat-hf)
* [huggyllama/llama-7b](https://huggingface.co/huggyllama/llama-7b)
## 🧩 Configuration

```yaml
slices:
  - sources:
      - model: NousResearch/Llama-2-7b-chat-hf
        layer_range: [0, 32]
      - model: huggyllama/llama-7b
        layer_range: [0, 32]
merge_method: slerp
base_model: NousResearch/Llama-2-7b-chat-hf
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
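
## 💻 Usage

In the config above, `t` is the slerp interpolation factor (0 keeps the base model's weights, 1 the other model's), and the `self_attn`/`mlp` filters apply different gradients across the 32 layers; the merge itself can be reproduced with mergekit's CLI, e.g. `mergekit-yaml config.yaml ./merged`. Below is a minimal inference sketch with 🤗 Transformers; `your-username/demo` is a placeholder repo id, not a published path.

```python
# Minimal inference sketch with 🤗 Transformers.
# NOTE: "your-username/demo" is a hypothetical repo id; replace it with the
# actual location of the merged weights.
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_id = "your-username/demo"  # placeholder, not the real path

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype in the merge config
    device_map="auto",           # requires the accelerate package
)

prompt = "What is a large language model?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```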