Update README.md
README.md CHANGED
@@ -84,6 +84,7 @@ widget:
 Mixnueza-6x32M-MoE is a Mixture of Experts (MoE) made with the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
 * 3 X [Felladrin/Minueza-32M-Base](https://huggingface.co/Felladrin/Minueza-32M-Base)
 * 3 X [Felladrin/Minueza-32M-UltraChat](https://huggingface.co/Felladrin/Minueza-32M-UltraChat)
+* [Evaluation Results](https://huggingface.co/datasets/open-llm-leaderboard/details_Isotonic__Mixnueza-6x32M-MoE)
 
 ## Recommended Prompt Format
 
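The diff describes a 6-expert MoE built from two Minueza-32M variants with LazyMergekit, but the merge config itself is not part of this commit. A hypothetical mergekit-moe YAML sketch of such a merge is shown below; the `gate_mode` and `dtype` values are assumptions, not taken from the commit:

```yaml
# Hypothetical mergekit-moe config sketch; the actual LazyMergekit
# settings used for Mixnueza-6x32M-MoE are not shown in this commit.
base_model: Felladrin/Minueza-32M-Base
gate_mode: random   # assumed; random gating needs no per-expert prompts
dtype: float32      # assumed
experts:
  - source_model: Felladrin/Minueza-32M-Base
  - source_model: Felladrin/Minueza-32M-Base
  - source_model: Felladrin/Minueza-32M-Base
  - source_model: Felladrin/Minueza-32M-UltraChat
  - source_model: Felladrin/Minueza-32M-UltraChat
  - source_model: Felladrin/Minueza-32M-UltraChat
```

A config like this is typically passed to the `mergekit-moe` CLI along with an output directory to produce the merged model.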