wav2vec2-1b-E50_freq_pause_speed

This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9857
  • Cer: 25.4112
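
The card does not include a usage snippet, so here is a minimal, hedged sketch of loading this checkpoint for CTC speech recognition with the Transformers library. The repo id is taken from the model tree above; the 16 kHz mono input, the use of librosa for loading audio, and the file name sample.wav are assumptions, and the snippet only works if processor/tokenizer files are present in the repo.

```python
# Minimal CTC inference sketch for this checkpoint (assumptions noted in the lead-in).
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2ForCTC

model_id = "Gummybear05/wav2vec2-1b-E50_freq_pause_speed"
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load audio and resample to the 16 kHz rate expected by wav2vec2-xls-r models.
speech, sr = librosa.load("sample.wav", sr=16_000)  # hypothetical input file

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding of the most probable token at each frame.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```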

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 2
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 8
  • total_train_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 50
  • num_epochs: 5
  • mixed_precision_training: Native AMP
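
The training script itself is not part of this card. As a rough illustration, the listed hyperparameters could map onto transformers.TrainingArguments as sketched below; output_dir is a placeholder, and fp16=True plus the 200-step evaluation interval are assumptions inferred from "Native AMP" and the results table that follows.

```python
# Sketch of how the listed hyperparameters could be expressed with TrainingArguments.
# Not the author's actual script; placeholders and inferred values are commented.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-1b-E50_freq_pause_speed",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=8,   # effective train batch size: 2 * 8 = 16
    num_train_epochs=5,
    lr_scheduler_type="linear",
    warmup_steps=50,
    seed=42,
    fp16=True,                       # assumed from "Native AMP" mixed precision
    eval_strategy="steps",
    eval_steps=200,                  # assumed from the 200-step evaluation interval below
    logging_steps=200,
)
```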

Training results

| Training Loss | Epoch  | Step | Validation Loss | Cer     |
|:-------------:|:------:|:----:|:---------------:|:-------:|
| 19.722        | 0.2580 | 200  | 10.6272         | 92.3226 |
| 9.1071        | 0.5160 | 400  | 4.8280          | 97.6034 |
| 4.6864        | 0.7741 | 600  | 5.1023          | 93.8381 |
| 4.894         | 1.0321 | 800  | 4.4428          | 92.8983 |
| 4.5268        | 1.2901 | 1000 | 4.4166          | 92.9864 |
| 4.4975        | 1.5481 | 1200 | 4.3971          | 92.8160 |
| 4.3905        | 1.8062 | 1400 | 4.2999          | 92.3931 |
| 4.0944        | 2.0642 | 1600 | 4.0658          | 89.2505 |
| 3.4243        | 2.3222 | 1800 | 3.5627          | 74.0073 |
| 2.5159        | 2.5802 | 2000 | 2.6364          | 54.6346 |
| 1.9341        | 2.8383 | 2200 | 2.1328          | 45.0070 |
| 1.4936        | 3.0963 | 2400 | 1.7956          | 41.7939 |
| 1.1786        | 3.3543 | 2600 | 1.4282          | 34.3163 |
| 1.0121        | 3.6123 | 2800 | 1.2753          | 30.2162 |
| 0.9051        | 3.8703 | 3000 | 1.2181          | 30.0164 |
| 0.7404        | 4.1284 | 3200 | 1.1032          | 29.0061 |
| 0.6383        | 4.3864 | 3400 | 1.0560          | 24.8884 |
| 0.609         | 4.6444 | 3600 | 0.9713          | 24.7063 |
| 0.5477        | 4.9024 | 3800 | 0.9857          | 25.4112 |
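
The Cer column above (and the evaluation result at the top of the card) is a character error rate expressed as a percentage. The card does not say how it was computed; a minimal sketch using the evaluate library (which wraps jiwer) is shown below, with hypothetical transcripts as input.

```python
# Character error rate computation, matching the "Cer" column (values in percent).
# The evaluate/jiwer backend is an assumption; the card does not state the metric code used.
import evaluate

cer_metric = evaluate.load("cer")

predictions = ["the quick brown fox"]        # hypothetical decoded transcripts
references = ["the quick brown fox jumps"]   # hypothetical ground-truth transcripts

cer = cer_metric.compute(predictions=predictions, references=references)
print(f"CER: {cer * 100:.4f}")  # multiplied by 100 to express as a percentage
```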

Framework versions

  • Transformers 4.45.2
  • Pytorch 2.3.1.post100
  • Datasets 2.19.1
  • Tokenizers 0.20.1
