Update README.md
README.md CHANGED
@@ -4,8 +4,8 @@
 
 # Mixtral MOE 2x34B
 
-* [One of Best Model reviewed by reddit
-* [Another review by reddit
+* [One of the best models, reviewed by the reddit community](https://www.reddit.com/r/LocalLLaMA/comments/1916896/llm_comparisontest_confirm_leaderboard_big_news/)
+* [Another review by the reddit community](https://www.reddit.com/r/LocalLLaMA/comments/191mvlp/i_have_tried_mixtral_34bx2_moe_also_named_yi/)
 
 Highest-scoring model on the Open LLM Leaderboard (2024-01-10)
 * [Average Score 76.66](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)