Commit 9cefb87 · Update README.md
Parent(s): 954ede7

README.md CHANGED
@@ -1,15 +1,21 @@
 ---
 language: ja
 tags:
-
-license:
+- speech
+license: other
 ---
 
 # distilhubert-ft-japanese-50k
 
 Fine-tuned (more precisely, continued training) for 50k steps on Japanese using the [JVS corpus](https://sites.google.com/site/shinnosuketakamichi/research-topics/jvs_corpus), the [Tsukuyomi-Chan corpus](https://tyc.rei-yumesaki.net/material/corpus/), [Amitaro's ITA corpus V2.1](https://amitaro.net/), and my own recordings of the [ITA corpus](https://github.com/mmorise/ita-corpus).
 
+## Attention
 
+This checkpoint was trained using the [JVS corpus](https://sites.google.com/site/shinnosuketakamichi/research-topics/jvs_corpus). Please read and accept its [terms of use](https://sites.google.com/site/shinnosuketakamichi/research-topics/jvs_corpus#h.p_OP_G8FT_Kuf4).
+(Those terms of use also apply to this checkpoint, i.e. they still apply when you use this checkpoint in another project, etc.)
+
+
+# References
 Original repos, many thanks!:
 [S3PRL](https://github.com/s3prl/s3prl/tree/main/s3prl/pretrain)
 - Used for training (with small modifications to train on my own datasets).
@@ -40,4 +46,4 @@ Note: This is not the best checkpoint and becomes more accurate with continued training.
 ```
 [https://amitaro.net/](https://amitaro.net/)
 
-Thanks!
+Thanks!
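
For anyone consuming the checkpoint described in this README, here is a minimal feature-extraction sketch. It is a hedged example, not part of the commit: it assumes the weights are available in (or have been converted to) the Hugging Face `transformers` HuBERT format, and the repo id below is a placeholder; an S3PRL-format `.ckpt` produced by the training setup above would need conversion before this works.

```python
# Minimal sketch, assuming the checkpoint loads as a `transformers` HubertModel.
# The repo id below is a placeholder, not the actual published location.
import numpy as np
import torch
from transformers import HubertModel, Wav2Vec2FeatureExtractor

repo_id = "<namespace>/distilhubert-ft-japanese-50k"  # placeholder repo id

feature_extractor = Wav2Vec2FeatureExtractor.from_pretrained(repo_id)
model = HubertModel.from_pretrained(repo_id)
model.eval()

# One second of 16 kHz silence stands in for real Japanese speech.
waveform = np.zeros(16000, dtype=np.float32)
inputs = feature_extractor(waveform, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    features = model(**inputs).last_hidden_state  # (batch, frames, hidden_size)

print(features.shape)
```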