
wav2vec2_base_vietnamese_control_dataset

This model is a fine-tuned version of nguyenvulebinh/wav2vec2-base-vi on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0491
  • WER: 0.2007
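
A minimal inference sketch follows, assuming the checkpoint is published as tuanmanh28/wav2vec2_base_vietnamese_control_dataset and uses a standard Wav2Vec2ForCTC head with a Wav2Vec2Processor; the card itself does not show the loading code, so treat this as an illustration rather than the author's usage:

```python
# Hypothetical loading/inference example; model id and processor class are
# assumptions based on the base model (nguyenvulebinh/wav2vec2-base-vi).
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "tuanmanh28/wav2vec2_base_vietnamese_control_dataset"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load a Vietnamese speech clip and resample to the 16 kHz rate wav2vec2 expects.
speech, _ = librosa.load("example.wav", sr=16_000)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: pick the most likely token per frame, then collapse repeats/blanks.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```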

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding TrainingArguments follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1000
  • num_epochs: 100
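
As a rough sketch only, the settings above map onto transformers TrainingArguments roughly as follows; the actual training script is not part of this card, and names such as output_dir are placeholders:

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="wav2vec2_base_vietnamese_control_dataset",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=100,
)
```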

Training results

| Training Loss | Epoch | Step  | Validation Loss | WER    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 19.8101       | 2.05  | 500   | 17.9020         | 1.0    |
| 12.7254       | 4.1   | 1000  | 13.4585         | 1.0    |
| 8.6927        | 6.15  | 1500  | 8.4217          | 1.0    |
| 5.6269        | 8.2   | 2000  | 5.1617          | 1.0    |
| 3.9616        | 10.25 | 2500  | 3.6167          | 1.0    |
| 3.2881        | 12.3  | 3000  | 3.2056          | 1.0    |
| 3.1381        | 14.34 | 3500  | 3.1174          | 1.0    |
| 3.092         | 16.39 | 4000  | 3.1044          | 1.0    |
| 3.0222        | 18.44 | 4500  | 2.9461          | 1.0    |
| 2.8394        | 20.49 | 5000  | 2.7133          | 1.0    |
| 2.6096        | 22.54 | 5500  | 2.3871          | 1.0    |
| 2.3244        | 24.59 | 6000  | 2.0311          | 1.0    |
| 1.9785        | 26.64 | 6500  | 1.6117          | 1.0    |
| 1.6346        | 28.69 | 7000  | 1.2173          | 1.0    |
| 1.3122        | 30.74 | 7500  | 0.9547          | 1.0    |
| 1.0927        | 32.79 | 8000  | 0.7738          | 1.0    |
| 0.9261        | 34.84 | 8500  | 0.6172          | 0.8970 |
| 0.7336        | 36.89 | 9000  | 0.4357          | 0.3654 |
| 0.5754        | 38.93 | 9500  | 0.3304          | 0.3071 |
| 0.4791        | 40.98 | 10000 | 0.2668          | 0.2785 |
| 0.4212        | 43.03 | 10500 | 0.2240          | 0.2548 |
| 0.3439        | 45.08 | 11000 | 0.1852          | 0.2329 |
| 0.3048        | 47.13 | 11500 | 0.1607          | 0.2119 |
| 0.2684        | 49.18 | 12000 | 0.1376          | 0.2105 |
| 0.2298        | 51.23 | 12500 | 0.1227          | 0.2071 |
| 0.2192        | 53.28 | 13000 | 0.1092          | 0.2055 |
| 0.2063        | 55.33 | 13500 | 0.0990          | 0.2039 |
| 0.1875        | 57.38 | 14000 | 0.0895          | 0.2039 |
| 0.1692        | 59.43 | 14500 | 0.0822          | 0.2039 |
| 0.1421        | 61.48 | 15000 | 0.0766          | 0.2029 |
| 0.1505        | 63.52 | 15500 | 0.0710          | 0.2031 |
| 0.1796        | 65.57 | 16000 | 0.0682          | 0.2019 |
| 0.1265        | 67.62 | 16500 | 0.0641          | 0.2015 |
| 0.1172        | 69.67 | 17000 | 0.0617          | 0.2019 |
| 0.1173        | 71.72 | 17500 | 0.0586          | 0.2011 |
| 0.1226        | 73.77 | 18000 | 0.0568          | 0.2015 |
| 0.1165        | 75.82 | 18500 | 0.0567          | 0.2011 |
| 0.1098        | 77.87 | 19000 | 0.0547          | 0.2007 |
| 0.0996        | 79.92 | 19500 | 0.0537          | 0.2009 |
| 0.1024        | 81.97 | 20000 | 0.0521          | 0.2009 |
| 0.0992        | 84.02 | 20500 | 0.0508          | 0.2009 |
| 0.1008        | 86.07 | 21000 | 0.0510          | 0.2009 |
| 0.1147        | 88.11 | 21500 | 0.0501          | 0.2007 |
| 0.1138        | 90.16 | 22000 | 0.0500          | 0.2005 |
| 0.0939        | 92.21 | 22500 | 0.0493          | 0.2007 |
| 0.1021        | 94.26 | 23000 | 0.0492          | 0.2005 |
| 0.1009        | 96.31 | 23500 | 0.0488          | 0.2009 |
| 0.0935        | 98.36 | 24000 | 0.0491          | 0.2007 |
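
The WER column is the word error rate on the validation set. A minimal sketch of how it can be computed with the evaluate library follows; this is an assumption, since the metric code actually used during training is not shown in this card, and the strings below are made-up examples:

```python
import evaluate

# WER = (substitutions + insertions + deletions) / number of reference words.
wer_metric = evaluate.load("wer")
predictions = ["xin chào các bạn"]        # hypothetical model output
references = ["xin chào các bạn nhé"]     # hypothetical ground truth
print(wer_metric.compute(predictions=predictions, references=references))  # 0.2 (1 deletion / 5 words)
```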

Framework versions

  • Transformers 4.32.1
  • PyTorch 2.0.1+cu117
  • Datasets 2.14.4
  • Tokenizers 0.13.3