KoichiYasuoka committed
Commit eed53cd · 1 Parent(s): 92bfe35
Files changed (1)
1. README.md +1 -1
README.md CHANGED
@@ -19,7 +19,7 @@ widget:
 
 ## Model Description
 
-This is a ModernBERT model pre-trained on Japanese Wikipedia and 青空文庫 texts. NVIDIA A100-SXM4-40GB×8 took 56 hours 49 minutes for training. You can fine-tune `modernbert-base-japanese-wikipedia` for downstream tasks, such as POS-tagging, dependency-parsing, and so on.
+This is a ModernBERT model pre-trained on Japanese Wikipedia and 青空文庫 texts. NVIDIA A100-SXM4-40GB×8 took 56 hours 49 minutes for training. You can fine-tune `modernbert-base-japanese-wikipedia` for downstream tasks, such as [POS-tagging](https://huggingface.co/KoichiYasuoka/modernbert-base-japanese-wikipedia-upos), [dependency-parsing](https://huggingface.co/KoichiYasuoka/modernbert-base-japanese-wikipedia-ud-square), and so on.
 
 ## How to Use
 
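The updated paragraph points readers to fine-tuning the checkpoint for downstream tasks. As a minimal sketch only (not taken from this commit or from the README's own "How to Use" section), the pre-trained model can be loaded through the standard transformers fill-mask interface; the repository id is inferred from the linked model cards, and the example sentence is purely illustrative.

```python
# Minimal sketch, not part of the commit: load the pre-trained checkpoint for
# fill-mask inference with Hugging Face transformers (ModernBERT support
# assumes a sufficiently recent transformers release).
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

model_id = "KoichiYasuoka/modernbert-base-japanese-wikipedia"  # inferred repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Illustrative sentence only; any Japanese text containing one mask token works.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
print(fill_mask(f"日本の首都は{tokenizer.mask_token}です。"))
```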