Commit 02f63ad
Parent(s): 5d20ce8
Update README.md
README.md CHANGED
@@ -10,7 +10,7 @@ This is RoBERTa model pretrained on texts in the Japanese language.

 3.45GB wikipedia text

-trained
+trained 125M step

 use the BERT BPE tokenizer.

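The updated model card states that the checkpoint was pretrained on 3.45GB of Japanese Wikipedia text for 125M steps and uses the BERT BPE tokenizer. Below is a minimal usage sketch with the Hugging Face transformers library; the repository id is a placeholder, since the commit does not name the model, and the fill-mask check is only illustrative.

```python
# Hypothetical usage sketch for a Japanese RoBERTa checkpoint like the one
# described in this README. The repo id below is a placeholder (assumption),
# not the actual model name from this commit.
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "your-namespace/japanese-roberta"  # placeholder repo id (assumption)

# AutoTokenizer loads whatever tokenizer ships with the checkpoint; per the
# README, this model uses the BERT BPE tokenizer.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Simple fill-mask sanity check on a short Japanese sentence.
text = "日本語の文章を" + tokenizer.mask_token + "する。"
inputs = tokenizer(text, return_tensors="pt")
outputs = model(**inputs)

# Find the masked position and print the highest-scoring prediction.
mask_positions = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
top_token_ids = outputs.logits[0, mask_positions].argmax(dim=-1)
print(tokenizer.decode(top_token_ids))
```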