wh_4_sun_syl_w_0_lr_8en5_b32_0025

This model is a fine-tuned version of openai/whisper-tiny on an unknown dataset. At the final epoch it achieves the following results on the training and validation sets:

  • Train Loss: 0.4963
  • Train Accuracy: 0.0312
  • Train Wermet: 0.1340
  • Train Wermet Syl: 0.2298
  • Validation Loss: 1.4423
  • Validation Accuracy: 0.0189
  • Validation Wermet: 0.3788
  • Validation Wermet Syl: 0.3437
  • Epoch: 24

Model description

More information needed

Intended uses & limitations

More information needed
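
In the meantime, the following is only a minimal loading sketch, assuming the checkpoint works with the standard TensorFlow Whisper classes in transformers and that the processor matches the openai/whisper-tiny base model (neither is documented on this card):

```python
from transformers import WhisperProcessor, TFWhisperForConditionalGeneration

# Assumption: the feature extractor/tokenizer follow the openai/whisper-tiny base.
processor = WhisperProcessor.from_pretrained("openai/whisper-tiny")
model = TFWhisperForConditionalGeneration.from_pretrained(
    "bigmorning/wh_4_sun_syl_w_0_lr_8en5_b32_0025"
)

# Typical transcription flow: turn 16 kHz audio into log-mel features,
# generate token ids, then decode them back to text.
# inputs = processor(audio_array, sampling_rate=16000, return_tensors="tf")
# generated_ids = model.generate(inputs.input_features)
# text = processor.batch_decode(generated_ids, skip_special_tokens=True)
```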

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a construction sketch follows the list):

  • optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 8e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0}
  • training_precision: float32
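
Since weight_decay_rate is 0, these settings amount to plain Adam with a constant 8e-5 learning rate. As a sketch only (the exact construction code is not recorded on this card), they map onto the AdamWeightDecay optimizer that transformers ships for TensorFlow like this:

```python
from transformers import AdamWeightDecay

# Sketch: rebuild the optimizer from the hyperparameters listed above.
# With weight_decay_rate=0.0 this behaves as standard Adam at a fixed LR.
optimizer = AdamWeightDecay(
    learning_rate=8e-5,
    weight_decay_rate=0.0,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-7,
    amsgrad=False,
)
```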

Training results

| Train Loss | Train Accuracy | Train Wermet | Train Wermet Syl | Validation Loss | Validation Accuracy | Validation Wermet | Validation Wermet Syl | Epoch |
|:----------:|:--------------:|:------------:|:----------------:|:---------------:|:-------------------:|:-----------------:|:---------------------:|:-----:|
| 5.2081     | 0.0105         | 1.5151       | 1.1936           | 4.0143          | 0.0113              | 0.9817            | 0.9765                | 0     |
| 4.7532     | 0.0116         | 0.8756       | 0.8371           | 3.9494          | 0.0114              | 0.9457            | 0.9171                | 1     |
| 4.7031     | 0.0117         | 0.8615       | 0.8175           | 3.9244          | 0.0114              | 0.9383            | 0.8993                | 2     |
| 4.6799     | 0.0117         | 0.8569       | 0.8117           | 3.9091          | 0.0114              | 0.9659            | 0.9487                | 3     |
| 4.6415     | 0.0117         | 0.8450       | 0.7985           | 3.8828          | 0.0115              | 0.9227            | 0.8867                | 4     |
| 4.6194     | 0.0118         | 0.8366       | 0.7901           | 3.8588          | 0.0115              | 0.9314            | 0.8973                | 5     |
| 4.5993     | 0.0118         | 0.8253       | 0.7803           | 3.9068          | 0.0116              | 0.9000            | 0.8526                | 6     |
| 4.5584     | 0.0120         | 0.7864       | 0.7455           | 3.7567          | 0.0118              | 0.8407            | 0.7907                | 7     |
| 4.4423     | 0.0123         | 0.7655       | 0.7209           | 3.5099          | 0.0123              | 0.8256            | 0.7786                | 8     |
| 4.0719     | 0.0133         | 0.7479       | 0.7112           | 2.9385          | 0.0135              | 0.7636            | 0.7193                | 9     |
| 3.5177     | 0.0148         | 0.7049       | 0.6917           | 2.4295          | 0.0148              | 0.6907            | 0.6559                | 10    |
| 2.8931     | 0.0170         | 0.6532       | 0.6928           | 2.0341          | 0.0161              | 0.6031            | 0.5778                | 11    |
| 2.4018     | 0.0190         | 0.6050       | 0.7024           | 1.7053          | 0.0174              | 0.5373            | 0.5223                | 12    |
| 1.9940     | 0.0210         | 0.5579       | 0.7168           | 1.5422          | 0.0180              | 0.4803            | 0.4793                | 13    |
| 1.7258     | 0.0223         | 0.5356       | 0.7426           | 1.4314          | 0.0186              | 0.4270            | 0.4029                | 14    |
| 1.4823     | 0.0237         | 0.4865       | 0.7094           | 1.3495          | 0.0190              | 0.4146            | 0.4082                | 15    |
| 1.2444     | 0.0252         | 0.4278       | 0.6455           | 1.2070          | 0.0198              | 0.3945            | 0.3965                | 16    |
| 1.1179     | 0.0261         | 0.3774       | 0.5791           | 1.1574          | 0.0200              | 0.3679            | 0.3479                | 17    |
| 1.0182     | 0.0267         | 0.3377       | 0.5283           | 1.3384          | 0.0189              | 0.3799            | 0.3800                | 18    |
| 0.9117     | 0.0276         | 0.2949       | 0.4680           | 1.2758          | 0.0192              | 0.4053            | 0.4485                | 19    |
| 0.7453     | 0.0290         | 0.2613       | 0.4292           | 1.1716          | 0.0203              | 0.3453            | 0.3306                | 20    |
| 0.6662     | 0.0296         | 0.2193       | 0.3666           | 1.1194          | 0.0205              | 0.3391            | 0.3188                | 21    |
| 0.5670     | 0.0305         | 0.1841       | 0.3121           | 1.2457          | 0.0200              | 0.3477            | 0.3347                | 22    |
| 0.5777     | 0.0304         | 0.1711       | 0.2936           | 1.2328          | 0.0201              | 0.3512            | 0.3321                | 23    |
| 0.4963     | 0.0312         | 0.1340       | 0.2298           | 1.4423          | 0.0189              | 0.3788            | 0.3437                | 24    |
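
Validation loss bottoms out at epoch 21 (1.1194, with the best validation accuracy of 0.0205) and rises over the last three epochs while training loss keeps falling, so the epoch-24 weights in this checkpoint may be mildly overfit relative to epoch 21.

If "Wermet" is, as the name suggests, a word-error-rate style metric (an assumption; the card does not document how it was computed), a comparable score can be reproduced offline with the jiwer package:

```python
import jiwer

# Hypothetical reference/hypothesis pair; jiwer.wer is standard word error rate.
reference = "the quick brown fox jumps"
hypothesis = "the quick brown box jumps"
print(jiwer.wer(reference, hypothesis))  # 0.2: one substitution over five words
```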

Framework versions

  • Transformers 4.34.0.dev0
  • TensorFlow 2.13.0
  • Tokenizers 0.13.3