---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-large-xls-r-300m-gn
  results: []
---
# wav2vec2-large-xls-r-300m-gn
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.1074
- Wer: 0.8952
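
Since this is an XLS-R checkpoint fine-tuned with a CTC head for speech recognition, it can be loaded with the standard `Wav2Vec2ForCTC` and `Wav2Vec2Processor` classes from `transformers`. The repository id and audio file name below are placeholders, not taken from this card; a minimal inference sketch, assuming 16 kHz mono audio:

```python
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder repo id: replace <username> with the actual namespace on the Hub.
model_id = "<username>/wav2vec2-large-xls-r-300m-gn"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# Placeholder audio file; resample to the 16 kHz rate expected by XLS-R.
speech, _ = librosa.load("sample.wav", sr=16_000)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: pick the most likely token per frame, then collapse repeats/blanks.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```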
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch of how they map onto `TrainingArguments` follows the list):
- learning_rate: 0.0003
- train_batch_size: 12
- eval_batch_size: 12
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 96
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 15
- mixed_precision_training: Native AMP
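
As a rough illustration, these values correspond to a `transformers.TrainingArguments` configuration along the following lines. The `output_dir` and the single-device assumption behind the effective batch size (96 = 12 per-device samples * 8 accumulation steps) are assumptions, not taken from this card:

```python
from transformers import TrainingArguments

# Sketch only: output_dir is a placeholder, and the total train batch size of 96
# assumes training on a single device.
training_args = TrainingArguments(
    output_dir="wav2vec2-large-xls-r-300m-gn",
    learning_rate=3e-4,
    per_device_train_batch_size=12,
    per_device_eval_batch_size=12,
    gradient_accumulation_steps=8,
    num_train_epochs=15,
    lr_scheduler_type="linear",
    warmup_steps=500,
    seed=42,
    fp16=True,          # "Native AMP" mixed-precision training
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```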
### Training results
| Training Loss | Epoch | Step  | Validation Loss | Wer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 3.4445        | 0.45  | 400   | 0.6193          | 0.6606 |
| 0.4459        | 0.91  | 800   | 0.3260          | 0.3907 |
| 0.3307        | 1.36  | 1200  | 0.2739          | 0.3315 |
| 0.3049        | 1.81  | 1600  | 0.2565          | 0.3027 |
| 0.2688        | 2.27  | 2000  | 0.2526          | 0.2863 |
| 0.2589        | 2.72  | 2400  | 0.2426          | 0.2821 |
| 0.2608        | 3.17  | 2800  | 0.2513          | 0.2965 |
| 0.2384        | 3.62  | 3200  | 0.2555          | 0.3052 |
| 0.2504        | 4.08  | 3600  | 0.2462          | 0.2855 |
| 0.2193        | 4.53  | 4000  | 0.2367          | 0.2691 |
| 0.2177        | 4.99  | 4400  | 0.2313          | 0.2637 |
| 0.2029        | 5.44  | 4800  | 0.2344          | 0.2633 |
| 0.2032        | 5.89  | 5200  | 0.2248          | 0.2553 |
| 0.1921        | 6.35  | 5600  | 0.2286          | 0.2668 |
| 0.188         | 6.8   | 6000  | 0.2239          | 0.2550 |
| 0.1808        | 7.25  | 6400  | 0.2323          | 0.2546 |
| 0.1791        | 7.71  | 6800  | 0.2285          | 0.2500 |
| 0.1796        | 8.16  | 7200  | 0.2467          | 0.2653 |
| 0.3112        | 8.61  | 7600  | 0.3921          | 0.3988 |
| 0.3545        | 9.07  | 8000  | 0.3703          | 0.3951 |
| 0.2528        | 9.52  | 8400  | 0.2441          | 0.2764 |
| 0.1932        | 9.97  | 8800  | 0.2385          | 0.2659 |
| 0.1688        | 10.42 | 9200  | 0.2245          | 0.2413 |
| 0.1645        | 10.88 | 9600  | 0.2220          | 0.2396 |
| 0.1736        | 11.34 | 10000 | 0.2456          | 0.2421 |
| 0.3031        | 11.79 | 10400 | 0.4347          | 0.2643 |
| 0.5795        | 12.24 | 10800 | 0.6777          | 0.3291 |
| 0.7227        | 12.7  | 11200 | 0.6537          | 0.3372 |
| 0.8282        | 13.15 | 11600 | 1.0544          | 0.8223 |
| 0.9917        | 13.6  | 12000 | 1.0290          | 0.7793 |
| 0.9407        | 14.06 | 12400 | 1.0095          | 0.7090 |
| 1.1154        | 14.51 | 12800 | 1.1014          | 0.8454 |
| 1.1354        | 14.97 | 13200 | 1.1074          | 0.8952 |
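
For reference, WER figures like the ones above can be computed from decoded transcriptions with the `evaluate` library; a minimal sketch, using illustrative strings rather than the actual evaluation data:

```python
import evaluate

wer_metric = evaluate.load("wer")

# Illustrative transcripts only; a real evaluation would pass the model's decoded
# predictions and the reference transcriptions from the evaluation set.
predictions = ["the predicted transcription", "another predicted sentence"]
references = ["the reference transcription", "another reference sentence"]

wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```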
### Framework versions
- Transformers 4.29.2
- Pytorch 2.0.1+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3