# Fine-Tuned-Whisper-Large-v2-Zeroth-STT-KO
This model is a fine-tuned version of openai/whisper-large-v2 for Korean speech-to-text, presumably on the Zeroth-Korean dataset (per the model name; the training data is not otherwise documented in this card). It achieves the following results on the evaluation set:
- Loss: 0.3667
- WER: 26.5596
- CER: 0.3686
## Model description
More information needed
## Intended uses & limitations
More information needed
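Pending fuller documentation, the following is a minimal inference sketch using the 🤗 Transformers ASR pipeline. The repository id and audio filename are placeholders, not confirmed by this card:

```python
# Minimal inference sketch; the repository id below is a placeholder
# based on this card's title, not a confirmed Hub path.
import torch
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/Fine-Tuned-Whisper-Large-v2-Zeroth-STT-KO",  # placeholder
    device="cuda:0" if torch.cuda.is_available() else "cpu",
)

# Whisper models expect 16 kHz mono audio; the pipeline resamples file inputs.
result = asr(
    "sample_ko.wav",  # placeholder audio file
    generate_kwargs={"language": "korean", "task": "transcribe"},
)
print(result["text"])
```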
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: 8-bit AdamW (`adamw_bnb_8bit`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 100
- training_steps: 1500
- mixed_precision_training: Native AMP
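The sketch below mirrors the hyperparameters above as `Seq2SeqTrainingArguments`. The actual training script is not included in this card, so the output directory and the evaluation cadence are assumptions (the 200-step cadence is inferred from the results table):

```python
# Configuration sketch mirroring the hyperparameters listed above;
# an illustration, not the exact script used to train this model.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-large-v2-zeroth-ko",  # placeholder path
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_bnb_8bit",          # 8-bit AdamW from bitsandbytes
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_steps=100,
    max_steps=1500,
    fp16=True,                        # "Native AMP" mixed precision
    eval_strategy="steps",
    eval_steps=200,                   # inferred from the results table below
)
```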
### Training results
| Training Loss | Epoch  | Step | Validation Loss | WER     | CER    |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|
| 0.2835        | 0.6944 | 200  | 0.3069          | 26.3590 | 0.3688 |
| 0.2194        | 1.3889 | 400  | 0.3056          | 26.4986 | 0.3670 |
| 0.1549        | 2.0833 | 600  | 0.3249          | 26.9697 | 0.3659 |
| 0.0791        | 2.7778 | 800  | 0.3215          | 26.2368 | 0.3680 |
| 0.0497        | 3.4722 | 1000 | 0.3388          | 26.9086 | 0.3885 |
| 0.028         | 4.1667 | 1200 | 0.3616          | 26.5684 | 0.3693 |
| 0.0145        | 4.8611 | 1400 | 0.3667          | 26.5596 | 0.3686 |
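For reference, WER and CER of the kind reported above can be computed with the `evaluate` library. The transcripts below are placeholders, and the reading that WER is reported as a percentage while CER is a raw fraction is an assumption based on the relative magnitudes in this card:

```python
# Sketch of WER/CER computation with the evaluate library.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

references = ["안녕하세요"]    # ground-truth transcripts (placeholder)
predictions = ["안녕 하세요"]  # model outputs (placeholder)

# Assumption: this card reports WER scaled to a percentage, CER as a fraction.
wer = 100 * wer_metric.compute(references=references, predictions=predictions)
cer = cer_metric.compute(references=references, predictions=predictions)
print(f"WER: {wer:.4f}  CER: {cer:.4f}")
```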
### Framework versions
- Transformers 4.48.1
- PyTorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0