---
tags:
- moe
- llama
- '3'
- llama 3
- 4x8b
---
# Llama-3-Peach-Instruct-4x8B-MoE

GGUF files of [Llama-3-Peach-Instruct-4x8B-MoE](https://huggingface.co/RDson/Llama-3-Peach-Instruct-4x8B-MoE).
<img src="https://i.imgur.com/MlnauLb.jpeg" width="640"/>
This is an experimental MoE model created with Mergekit from:
* [meta-llama/Meta-Llama-3-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct)
* [Salesforce/SFR-Iterative-DPO-LLaMA-3-8B-R](https://huggingface.co/Salesforce/SFR-Iterative-DPO-LLaMA-3-8B-R)
* [NousResearch/Hermes-2-Theta-Llama-3-8B](https://huggingface.co/NousResearch/Hermes-2-Theta-Llama-3-8B)
* [rombodawg/Llama-3-8B-Instruct-Coder](https://huggingface.co/rombodawg/Llama-3-8B-Instruct-Coder)

Evaluation (Q4_K_M):
* GSM8K (5-shot): 0.6983 ± 0.0126
* GSM8K (8-shot, CoT): 0.674 ± 0.0129
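The reported ± values are consistent with the binomial standard error of an accuracy estimate over GSM8K's 1,319-problem test split (the split size is an assumption based on the standard benchmark, not stated in this card); a quick check:

```python
import math

def binomial_se(p: float, n: int) -> float:
    """Standard error of an accuracy estimate p measured over n independent problems."""
    return math.sqrt(p * (1.0 - p) / n)

# Assumed GSM8K test set size (standard benchmark split)
N_GSM8K = 1319

se_5shot = binomial_se(0.6983, N_GSM8K)  # 5-shot accuracy
se_8shot = binomial_se(0.674, N_GSM8K)   # 8-shot CoT accuracy

print(round(se_5shot, 4))  # 0.0126, matching the reported ± value
print(round(se_8shot, 4))  # 0.0129, matching the reported ± value
```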
Mergekit YAML config:
```yaml
base_model: Meta-Llama-3-8B-Instruct
experts:
- source_model: Meta-Llama-3-8B-Instruct
positive_prompts:
- "explain"
- "chat"
- "assistant"
- "think"
- "roleplay"
- "versatile"
- "helpful"
- "factual"
- "integrated"
- "adaptive"
- "comprehensive"
- "balanced"
negative_prompts:
- "specialized"
- "narrow"
- "focused"
- "limited"
- "specific"
- source_model: Llama-3-8B-Instruct-Coder
positive_prompts:
- "python"
- "math"
- "solve"
- "code"
- "programming"
- "javascript"
- "algorithm"
- "factual"
negative_prompts:
- "sorry"
- "cannot"
- "concise"
- "imaginative"
- "creative"
- source_model: SFR-Iterative-DPO-LLaMA-3-8B-R
positive_prompts:
- "AI"
- "instructive"
- "chat"
- "assistant"
- "clear"
- "directive"
- "helpful"
- "informative"
- source_model: Hermes-2-Theta-Llama-3-8B
positive_prompts:
- "chat"
- "assistant"
- "analytical"
- "accurate"
- "code"
- "logical"
- "knowledgeable"
- "precise"
- "calculate"
- "compute"
- "solve"
- "work"
- "python"
- "javascript"
- "programming"
- "algorithm"
- "tell me"
- "assistant"
- "factual"
negative_prompts:
- "abstract"
- "artistic"
- "emotional"
- "mistake"
- "inaccurate"
gate_mode: hidden
dtype: float16
```
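With `gate_mode: hidden`, Mergekit initializes each expert's router weights from hidden-state representations of its positive (and negative) prompts; at inference, the router scores each token's hidden state against the experts and sends it to the top-scoring ones, Mixtral-style. A minimal sketch of that routing step, using toy dimensions and random weights (purely illustrative, not the actual merged model's router):

```python
import numpy as np

rng = np.random.default_rng(0)

hidden_dim, num_experts, top_k = 16, 4, 2  # toy sizes; the real model has 4 experts

# One score row per expert. Under gate_mode: hidden these rows would be
# derived from prompt hidden states; here they are random for illustration.
router_w = rng.standard_normal((num_experts, hidden_dim))

def route(hidden_state: np.ndarray):
    """Return the top-k expert indices and their softmax-normalized weights."""
    scores = router_w @ hidden_state            # (num_experts,) router logits
    top = np.argsort(scores)[-top_k:][::-1]     # indices of the k best experts
    w = np.exp(scores[top] - scores[top].max()) # softmax over the selected experts
    return top, w / w.sum()                     # weights sum to 1

experts, weights = route(rng.standard_normal(hidden_dim))
print(experts, weights)  # two expert ids and their mixing weights
```

Each token's output is then the weighted sum of the selected experts' feed-forward outputs, which is why the positive/negative prompt lists above shape which expert handles which kind of input.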
Some inspiration for the Mergekit YAML config came from [LoneStriker/Umbra-MoE-4x10.7-2.4bpw-h6-exl2](https://huggingface.co/LoneStriker/Umbra-MoE-4x10.7-2.4bpw-h6-exl2).