KoichiYasuoka committed
Commit f074ee4
Parent(s): 5b2037a

base_model

Files changed (1):
  1. README.md +2 -1
README.md CHANGED
@@ -7,6 +7,7 @@ tags:
 - "pos"
 - "wikipedia"
 - "dependency-parsing"
+base_model: tohoku-nlp/bert-large-japanese
 datasets:
 - "universal_dependencies"
 license: "cc-by-sa-4.0"
@@ -19,7 +20,7 @@ widget:
 
 ## Model Description
 
-This is a BERT model pre-trained on Japanese Wikipedia texts for POS-tagging and dependency-parsing, derived from [bert-large-japanese](https://huggingface.co/cl-tohoku/bert-large-japanese). Every long-unit-word is tagged by [UPOS](https://universaldependencies.org/u/pos/) (Universal Part-Of-Speech).
+This is a BERT model pre-trained on Japanese Wikipedia texts for POS-tagging and dependency-parsing, derived from [bert-large-japanese](https://huggingface.co/tohoku-nlp/bert-large-japanese). Every long-unit-word is tagged by [UPOS](https://universaldependencies.org/u/pos/) (Universal Part-Of-Speech).
 
 ## How to Use
 
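
The edited Model Description sits just above a "How to Use" section that this diff does not show. For context, below is a minimal sketch of how a UPOS token-classification model of this kind is typically loaded with the Transformers library; the repository id is an assumption made for illustration and is not stated anywhere in this commit.

```py
# Minimal sketch: loading a UPOS token-classification model with Transformers.
# The repository id below is an assumption for illustration; substitute the
# actual model repository that this README belongs to.
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

model_id = "KoichiYasuoka/bert-large-japanese-upos"  # hypothetical example id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# Tag each token with its UPOS label. Dependency parsing would need the model's
# own pipeline code and is not covered by this plain token-classification sketch.
tagger = pipeline("token-classification", model=model, tokenizer=tokenizer)
for token in tagger("国境の長いトンネルを抜けると雪国であった。"):
    print(token["word"], token["entity"])
```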