Update README.md
README.md CHANGED
@@ -61,7 +61,7 @@ model-index:

# Kotoba-Whisper
_Kotoba-Whisper_ is a collection of distilled [Whisper](https://arxiv.org/abs/2212.04356) models for Japanese ASR, developed through the collaboration between
-[Asahi Ushio](https://asahiushio.com) and [Kotoba Technologies](https://
+[Asahi Ushio](https://asahiushio.com) and [Kotoba Technologies](https://twitter.com/kotoba_tech).
Following the original work of distil-whisper ([Robust Knowledge Distillation via Large-Scale Pseudo Labelling](https://arxiv.org/abs/2311.00430)),
we employ OpenAI's [Whisper large-v3](https://huggingface.co/openai/whisper-large-v3) as the teacher model, and the student model consists of the full encoder of the
teacher large-v3 model and the decoder with two layers initialized from the first and last layer of the large-v3 model.
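The student construction described in the changed paragraph (full teacher encoder, plus a two-layer decoder seeded from the teacher's first and last decoder layers) can be illustrated with a minimal sketch using the Hugging Face `transformers` API. This is an assumption-laden illustration, not the authors' actual distillation code; the copy-by-state-dict approach and the omission of the training loop are simplifications.

```python
# Minimal sketch (not the authors' distillation code) of the student initialization
# described above: keep the full Whisper large-v3 encoder and build a 2-layer
# decoder seeded from the teacher's first and last decoder layers.
import copy

from transformers import WhisperForConditionalGeneration

teacher = WhisperForConditionalGeneration.from_pretrained("openai/whisper-large-v3")

# Student config: identical to the teacher, but with only two decoder layers.
student_config = copy.deepcopy(teacher.config)
student_config.decoder_layers = 2
student = WhisperForConditionalGeneration(student_config)

# Reuse the teacher's encoder weights unchanged.
student.model.encoder.load_state_dict(teacher.model.encoder.state_dict())

# Copy shared decoder components (token/position embeddings, final layer norm).
student.model.decoder.embed_tokens.load_state_dict(teacher.model.decoder.embed_tokens.state_dict())
student.model.decoder.embed_positions.load_state_dict(teacher.model.decoder.embed_positions.state_dict())
student.model.decoder.layer_norm.load_state_dict(teacher.model.decoder.layer_norm.state_dict())

# Seed the two student decoder layers from the teacher's first and last decoder layers.
student.model.decoder.layers[0].load_state_dict(teacher.model.decoder.layers[0].state_dict())
student.model.decoder.layers[1].load_state_dict(teacher.model.decoder.layers[-1].state_dict())
```

The resulting student would then be trained on pseudo-labels produced by the teacher, following the distil-whisper recipe cited above.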