Update README.md
README.md CHANGED
@@ -1,5 +1,17 @@
 ---
 license: cc-by-sa-4.0
+datasets:
+- wikipedia
+- cc100
+language:
+- ja
+pipeline_tag: text-generation
+tags:
+- gpt
+- japanese
+- language model
+widget:
+- text: 今日はいい天気なので、
 ---
 # japanese-gpt2-medium-unidic
 This is a medium-sized Japanese GPT-2 model using BERT-like tokenizer.
@@ -65,4 +77,4 @@ The vocabulary size is 32771 (32768 original tokens + 2 special tokens + 1 unuse
 
 Copyright (c) 2021, Tohoku University
 
-Copyright (c) 2023, Tokyo Institute of Technology
+Copyright (c) 2023, Tokyo Institute of Technology
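For reference, the metadata added above (`pipeline_tag: text-generation` plus the widget prompt) corresponds to the standard `transformers` text-generation flow. The sketch below only illustrates that mapping and is not part of the commit: the repository id is a placeholder because the diff does not show the model's namespace, and `AutoTokenizer` / `AutoModelForCausalLM` are assumed to resolve the BERT-like tokenizer mentioned in the card.

```python
# Minimal sketch of what the new card metadata advertises: text generation
# seeded with the widget prompt added in this commit.
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

# Placeholder id: replace <namespace> with the model's actual owner on the Hub.
model_id = "<namespace>/japanese-gpt2-medium-unidic"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
generator = pipeline("text-generation", model=model, tokenizer=tokenizer)

# Widget prompt from the YAML above: 今日はいい天気なので、
# ("Since the weather is nice today, ...")
print(generator("今日はいい天気なので、", max_new_tokens=50)[0]["generated_text"])
```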