Whisper tiny AR - BH

This model is a fine-tuned version of openai/whisper-tiny on the quran-ayat-speech-to-text dataset. It achieves the following results on the evaluation set (a brief usage sketch follows the results):

  • Loss: 0.0150
  • Wer: 13.4379
  • Cer: 4.1186

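As a usage illustration (not part of the original card), the checkpoint can be loaded with the Hugging Face Transformers ASR pipeline. This is a minimal sketch: the repo ID Baselhany/Whisper_tiny_fine_tune_Quran is assumed, and the audio file name is a placeholder.

```python
# Minimal inference sketch (assumptions: the checkpoint is published as
# Baselhany/Whisper_tiny_fine_tune_Quran and sample_ayah.wav is a local audio file).
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="Baselhany/Whisper_tiny_fine_tune_Quran",  # assumed repo ID
)

# Force Arabic transcription; Whisper otherwise auto-detects the language.
result = asr("sample_ayah.wav", generate_kwargs={"language": "ar", "task": "transcribe"})
print(result["text"])
```
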
Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch reproducing them follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 64
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 19
  • mixed_precision_training: Native AMP

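The card does not include the training script, so the following is only a sketch of how the hyperparameters above map onto Seq2SeqTrainingArguments; output_dir is a placeholder, and the 400-step evaluation cadence is inferred from the results table below.

```python
# Rough reconstruction of the listed hyperparameters; output_dir and the
# evaluation cadence are assumptions, not taken from the original card.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-tiny-ar-quran",  # placeholder path
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,         # total train batch size 64 (assuming a single device)
    seed=42,
    optim="adamw_torch",                   # betas=(0.9, 0.999) and eps=1e-8 are the defaults
    lr_scheduler_type="cosine",
    warmup_steps=500,
    num_train_epochs=19,
    fp16=True,                             # "Native AMP" mixed precision
    eval_strategy="steps",                 # assumed; the table reports metrics every 400 steps
    eval_steps=400,
    predict_with_generate=True,
)
```
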
Training results

Training Loss | Epoch | Step | Cer | Validation Loss | Wer
0.0258 0.1408 400 52.2218 0.0246 104.9348
0.0177 0.2817 800 10.2633 0.0184 26.2089
0.0116 0.4225 1200 7.3210 0.0160 20.9517
0.0101 0.5633 1600 5.8227 0.0141 17.5020
0.008 0.7042 2000 5.1235 0.0127 16.3695
0.0057 0.8450 2400 4.8168 0.0119 15.2343
0.0056 0.9858 2800 4.6678 0.0116 14.6364
0.0071 1.1267 3200 5.3042 0.0135 15.8929
0.0059 1.2676 3600 5.0437 0.0132 15.7165
0.0056 1.4084 4000 5.3648 0.0124 14.5758
0.0041 1.5492 4400 4.7531 0.0122 14.2259
0.0038 1.6901 4800 4.7431 0.0120 13.8043
0.004 1.8309 5200 4.9569 0.0119 14.1818
0.0036 1.9717 5600 4.9171 0.0118 14.0743
0.0033 2.1127 6000 5.0453 0.0129 15.0828
0.0033 2.2535 6400 5.1424 0.0128 14.9340
0.0033 2.3943 6800 5.0171 0.0123 14.7329
0.0033 2.5352 7200 4.3676 0.0122 13.6748
0.0034 2.6760 7600 4.5300 0.0122 13.5618
0.0025 2.8168 8000 4.4698 0.0122 13.3662
0.0028 2.9577 8400 4.5794 0.0122 13.5536
0.003 3.0986 8800 5.0764 0.0125 15.1021
0.0024 3.2394 9200 5.1331 0.0125 14.6943
0.0019 3.3802 9600 5.8448 0.0128 16.2924
0.0023 3.5211 10000 5.1642 0.0128 14.7301
0.002 3.6619 10400 4.9046 0.0127 13.8649
0.0018 3.8027 10800 4.9748 0.0126 13.6610
0.0021 3.9436 11200 5.0136 0.0126 13.8539
0.0018 4.0845 11600 5.0283 0.0132 14.6475
0.0018 4.2253 12000 4.5932 0.0132 13.7988
0.0022 4.3662 12400 4.3948 0.0130 13.7354
0.0025 4.5070 12800 4.7691 0.0131 14.3774
0.0018 4.6478 13200 4.8726 0.0131 14.0854
0.0016 4.7887 13600 4.7136 0.0130 14.0165
0.0018 4.9295 14000 4.7886 0.0130 14.0661
0.0017 5.0704 14400 4.5393 0.0133 14.0110
0.0013 5.2112 14800 4.3028 0.0132 13.7547
0.0017 5.3521 15200 4.5275 0.0133 14.2231
0.0014 5.4929 15600 4.6271 0.0135 14.1983
0.0016 5.6337 16000 4.3983 0.0134 13.8539
0.0015 5.7746 16400 4.2035 0.0134 13.5426
0.0016 5.9154 16800 4.2561 0.0134 13.6335
0.0015 6.0563 17200 4.3246 0.0134 13.6059
0.0015 6.1972 17600 4.1759 0.0137 13.6142
0.0016 6.3380 18000 4.2195 0.0137 13.5536
0.0014 6.4788 18400 4.4176 0.0137 13.8760
0.0015 6.6197 18800 4.2144 0.0137 13.5784
0.0015 6.7605 19200 4.1868 0.0137 13.4874
0.0016 6.9013 19600 4.0946 0.0137 13.3442
0.0015 7.0422 20000 4.1526 0.0139 13.5508
0.0012 7.1831 20400 4.1830 0.0139 13.5040
0.0011 7.3239 20800 4.0708 0.0138 13.3194
0.0017 7.4647 21200 4.0446 0.0138 13.3552
0.0012 7.6056 21600 4.0699 0.0139 13.3194
0.0011 7.7464 22000 4.0378 0.0140 13.3001
0.0012 7.8872 22400 4.0558 0.0139 13.3442
0.0012 8.0282 22800 4.1519 0.0140 13.6225
0.0011 8.1690 23200 4.1673 0.0142 13.4571
0.0013 8.3098 23600 4.1215 0.0141 13.5095
0.001 8.4507 24000 4.0753 0.0142 13.3827
0.0014 8.5915 24400 4.0683 0.0141 13.4985
0.0012 8.7323 24800 4.1103 0.0141 13.4985
0.0014 8.8732 25200 4.0273 0.0141 13.3579
0.0012 9.0141 25600 4.0276 0.0141 13.4075
0.0012 9.1549 26000 4.1824 0.0141 13.4379
0.0011 9.2957 26400 4.1019 0.0142 13.4268
0.0011 9.4366 26800 4.0923 0.0142 13.2946
0.001 9.5774 27200 4.0010 0.0143 13.2477
0.001 9.7182 27600 4.0398 0.0142 13.2560
0.0013 9.8591 28000 4.0109 0.0142 13.2560
0.001 9.9999 28400 4.0093 0.0142 13.2395
0.001 10.1408 28800 4.0721 0.0143 13.4020
0.0013 10.2817 29200 4.0817 0.0144 13.5536
0.0011 10.4225 29600 4.0897 0.0144 13.4902
0.0013 10.5633 30000 4.0567 0.0144 13.3414
0.0008 10.7042 30400 4.0587 0.0144 13.2973
0.0012 10.8450 30800 4.0724 0.0144 13.3249
0.0011 10.9858 31200 4.0590 0.0144 13.3028
0.0013 11.1267 31600 4.0023 0.0144 13.3001
0.0011 11.2676 32000 4.1324 0.0146 13.5894
0.0014 11.4084 32400 4.0923 0.0146 13.4627
0.0009 11.5492 32800 4.0414 0.0146 13.3827
0.0011 11.6901 33200 4.0436 0.0145 13.3717
0.0007 11.8309 33600 4.0622 0.0145 13.4103
0.0012 11.9717 34000 4.0491 0.0145 13.3910
0.001 12.1127 34400 4.1083 0.0145 13.5012
0.0009 12.2535 34800 4.0523 0.0146 13.3221
0.0011 12.3943 35200 4.1317 0.0146 13.4075
0.0009 12.5352 35600 4.0644 0.0147 13.3690
0.0009 12.6760 36000 4.1167 0.0147 13.4323
0.0011 12.8168 36400 4.1032 0.0147 13.4213
0.001 12.9577 36800 4.1064 0.0147 13.4323
0.001 13.0986 37200 4.1417 0.0147 13.5508
0.0012 13.2394 37600 4.1074 0.0147 13.3717
0.0011 13.3802 38000 4.0994 0.0148 13.3827
0.0009 13.5211 38400 4.0821 0.0147 13.4103
0.0012 13.6619 38800 4.0766 0.0148 13.3552
0.0009 13.8027 39200 4.0728 0.0148 13.3276
0.0011 13.9436 39600 4.0744 0.0148 13.3359
0.0007 14.0845 40000 4.0398 0.0147 13.3442
0.001 14.2253 40400 4.0641 0.0147 13.3607
0.001 14.3662 40800 4.1003 0.0148 13.3579
0.0011 14.5070 41200 4.1019 0.0148 13.4847
0.0009 14.6478 41600 4.1170 0.0148 13.4351
0.001 14.7887 42000 4.0750 0.0148 13.4020
0.0012 14.9295 42400 4.1173 0.0148 13.4434
0.0012 15.0704 42800 4.0917 0.0149 13.4985
0.0009 15.2112 43200 4.0958 0.0148 13.3607
0.001 15.3521 43600 4.0314 0.0148 13.3084
0.0009 15.4929 44000 4.0423 0.0148 13.3111
0.0008 15.6337 44400 4.0535 0.0148 13.3331
0.0007 15.7746 44800 4.0619 0.0148 13.3717
0.0009 15.9154 45200 4.0494 0.0148 13.3717
0.0008 16.0563 45600 4.1026 0.0149 13.3965
0.0007 16.1972 46000 4.0885 0.0148 13.3965
0.001 16.3380 46400 4.0712 0.0148 13.3745
0.0009 16.4788 46800 4.0673 0.0149 13.4103
0.0011 16.6197 47200 4.0699 0.0149 13.4654
0.001 16.7605 47600 4.0539 0.0149 13.3993
0.0007 16.9013 48000 4.0660 0.0149 13.3938
0.0012 17.0422 48400 4.0792 0.0149 13.4434
0.0009 17.1831 48800 4.0516 0.0149 13.3883
0.0008 17.3239 49200 4.0141 0.0150 13.3883
0.0011 17.4647 49600 4.0523 0.0149 13.3552
0.0006 17.6056 50000 4.0609 0.0149 13.3552
0.0009 17.7464 50400 4.0827 0.0149 13.4406
0.0007 17.8872 50800 4.0965 0.0149 13.4434
0.0011 18.0282 51200 4.0696 0.0150 13.4020
0.0007 18.1690 51600 4.0619 0.0150 13.4516
0.0008 18.3098 52000 4.0388 0.0150 13.3276
0.0009 18.4507 52400 4.0439 0.0150 13.3414
0.001 18.5915 52800 4.0930 0.0150 13.4406
0.0007 18.7323 53200 4.1163 0.0150 13.4351
0.0008 18.8732 53600 4.1186 0.0150 13.4379
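
WER and CER values like those in the table can be computed with the Hugging Face evaluate library. A minimal sketch, assuming the scores are reported as percentages; the reference and prediction strings are placeholders, not data from this model.

```python
# Sketch of how WER/CER metrics such as those above can be computed with the
# `evaluate` library (requires the jiwer package); the strings are placeholders.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

references = ["بسم الله الرحمن الرحيم"]   # placeholder reference transcript
predictions = ["بسم الله الرحمن رحيم"]    # placeholder model output

wer = 100 * wer_metric.compute(predictions=predictions, references=references)
cer = 100 * cer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.2f}%  CER: {cer:.2f}%")
```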

Framework versions

  • Transformers 4.47.0
  • Pytorch 2.5.1+cu121
  • Datasets 3.3.1
  • Tokenizers 0.21.0