Commit bcdfc3c · Parent: fccd79b

Update README.md

Populating more results in the table.

README.md CHANGED
@@ -5,7 +5,6 @@ language:
 pipeline_tag: text-generation
 library_name: transformers
 tags:
-- nlp
 - llm
 - code
 ---
@@ -14,10 +13,14 @@ tags:
 
 CrystalCoder is a state-of-the-art 7B parameter language model, distinctively trained on the SlimPajama and StarCoder datasets. This model excels in balancing natural language processing and coding capabilities. Despite being trained on a smaller dataset of 1.4 trillion tokens—compared to LLaMA 2's 2 trillion—CrystalCoder surpasses LLaMA 2 in some challenging English and coding tasks. It demonstrates superior performance in benchmarks like MMLU, HumanEval, and MBPP.
 
-| Model | Trained Tokens | MMLU (5-shot) | HumanEval (pass@1) | MBPP (pass@1) |
-| --- | --- | --- | --- | --- |
-| … | … | … | … | … |
-| … | … | … | … | … |
+| Model | Trained Tokens | ARC | HellaSwag | MMLU (5-shot) | TruthfulQA | Language Avg. | HumanEval (pass@1) | MBPP (pass@1) | Coding Avg. | Avg. of Avg. |
+| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
+| Mistral 7B | - | 59.98 | 83.31 | 64.16 | 42.15 | 63.40 | 29.12 | 38.78 | 33.95 | 48.68 |
+| **CrystalCoder 7B** | 1.4T | 47.01 | 71.97 | 48.78 | 35.91 | 50.92 | 28.38 | 36.38 | 32.38 | 41.65 |
+| CodeLlama 7B | 2.5T | 39.93 | 60.80 | 31.12 | 37.82 | 42.42 | 33.50 | 41.40 | 37.45 | 39.94 |
+| OpenLLaMA v2 7B | 1T | 43.60 | 72.20 | 41.29 | 35.54 | 48.18 | 15.32 | 12.69 | 28.01 | 38.10 |
+| LLaMA 2 7B | 2T | 53.07 | 77.74 | 43.80 | 38.98 | 53.39 | 13.05 | 20.09 | 16.57 | 34.98 |
+| StarCoder-15B | 1.03T | - | - | - | - | - | 33.63 | 43.28 | 38.46 | - |
 
 ## About LLM360
 LLM360 is an initiative for comprehensive and fully open-sourced LLMs,
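The averaged columns added in this commit can be sanity-checked from the per-benchmark scores: Language Avg. is the mean of the four language benchmarks, Coding Avg. the mean of the two coding benchmarks, and Avg. of Avg. the mean of those two. A minimal sketch using the CrystalCoder 7B row from the table above (values copied from the diff; the table appears to report each average rounded to two decimals):

```python
# Recompute the averaged columns of the results table from the
# per-benchmark scores, using the CrystalCoder 7B row as an example.
arc, hellaswag, mmlu, truthfulqa = 47.01, 71.97, 48.78, 35.91
humaneval, mbpp = 28.38, 36.38

# Mean of the four language benchmarks.
language_avg = (arc + hellaswag + mmlu + truthfulqa) / 4
# Mean of the two coding benchmarks.
coding_avg = (humaneval + mbpp) / 2
# Mean of the two averages.
avg_of_avg = (language_avg + coding_avg) / 2

print(language_avg, coding_avg, avg_of_avg)
```

These reproduce the table's 50.92 / 32.38 / 41.65 for CrystalCoder up to two-decimal rounding.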