ykhwang committed on
Commit 93407a2 · 1 Parent(s): dc2109e

Update README.md

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -49,7 +49,7 @@ We evaluate 42dot-PLM on a variety of academic benchmarks both on Korean and Eng
  <img src="https://huggingface.co/42dot/42dot-plm-1.3b/resolve/main/asset/plm_benchmark_ko.png" width="90%" height="90%"/>
  </figure>
 
- |Tasks / Macro-F1|[KoGPT2](https://github.com/SKT-AI/KoGPT2) <br>1.2B|[Polyglot-Ko](https://github.com/EleutherAI/polyglot) <br>1.3B|[XGLM](https://huggingface.co/facebook/xglm-1.7B) <br>1.7B|[PolyLM](https://huggingface.co/DAMO-NLP-MT/polylm-1.7b) <br>1.7B|42dot-PLM <br>1.3B ko-en|
+ |Tasks / Macro-F1|[KoGPT2](https://github.com/SKT-AI/KoGPT2) <br>1.2B|[Polyglot-Ko](https://github.com/EleutherAI/polyglot) <br>1.3B|[XGLM](https://huggingface.co/facebook/xglm-1.7B) <br>1.7B|[PolyLM](https://huggingface.co/DAMO-NLP-MT/polylm-1.7b) <br>1.7B|42dot-PLM <br>1.3B|
  |--------------|-----------|----------------|---------|-----------|------------------------|
  |boolq |0.337 |0.355 |**0.502** |0.334 |0.351 |
  |copa |0.67 |**0.721** |0.616 |0.513 |0.711 |
@@ -64,7 +64,7 @@ We evaluate 42dot-PLM on a variety of academic benchmarks both on Korean and Eng
  <img src="https://huggingface.co/42dot/42dot-plm-1.3b/resolve/main/asset/plm_benchmark_en.png" width="90%" height="90%"/>
  </figure>
 
- | Tasks / Metric | [MPT](https://huggingface.co/mosaicml/mpt-1b-redpajama-200b) <br>1B | [OPT](https://huggingface.co/facebook/opt-1.3b) <br>1.3B | XGLM <br>1.7B | PolyLM <br>1.7B | 42dot-PLM <br>1.3B ko-en |
+ | Tasks / Metric | [MPT](https://huggingface.co/mosaicml/mpt-1b-redpajama-200b) <br>1B | [OPT](https://huggingface.co/facebook/opt-1.3b) <br>1.3B | XGLM <br>1.7B | PolyLM <br>1.7B | 42dot-PLM <br>1.3B |
  | ---------------------- | ------ | -------- | --------- | ----------- | ------------------------ |
  | anli_r1/acc | 0.309 | **0.341** | 0.334 | 0.336 | 0.328 |
  | anli_r2/acc | 0.334 | **0.339** | 0.331 | 0.314 | 0.334 |