# wav2vec2-large-xlsr-53-Bhojpuri-Version6

This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.9380
- Wer: 0.4226
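
For reference, word error rate (WER) is the word-level edit distance between the model's transcript and the reference, divided by the number of reference words, so the 0.4226 above means roughly 42 word errors per 100 reference words. A minimal sketch of the metric (the card's value was most likely computed with a library such as `evaluate` or `jiwer`; this pure-Python reimplementation is only illustrative):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length.

    Assumes a non-empty, whitespace-tokenized reference.
    """
    ref, hyp = reference.split(), hypothesis.split()
    # DP table: d[i][j] = edit distance between ref[:i] and hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,         # deletion
                d[i][j - 1] + 1,         # insertion
                d[i - 1][j - 1] + cost,  # substitution / match
            )
    return d[len(ref)][len(hyp)] / len(ref)
```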

## Model description

More information needed

## Intended uses & limitations

More information needed
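
The checkpoint can be loaded for Bhojpuri transcription with the standard `transformers` automatic-speech-recognition pipeline. A minimal sketch (the helper name and the audio filename are illustrative; the weights are downloaded on the first call):

```python
def load_bhojpuri_asr(
    model_id: str = "khushi1234455687/wav2vec2-large-xlsr-53-Bhojpuri-Version6",
):
    """Build a transformers ASR pipeline for this checkpoint."""
    # Imported lazily so defining the helper stays cheap; the pipeline
    # itself downloads the model weights when first constructed.
    from transformers import pipeline

    return pipeline("automatic-speech-recognition", model=model_id)

# Usage (commented out: requires the checkpoint and a local audio file):
# asr = load_bhojpuri_asr()
# print(asr("sample.wav")["text"])
```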

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 20
- mixed_precision_training: Native AMP
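
Two of these values interact: the total train batch size of 32 is the per-device batch size of 8 multiplied by 4 gradient-accumulation steps, and the linear scheduler ramps the learning rate from 0 to 5e-05 over the first 500 optimizer steps before decaying it linearly to 0. A minimal sketch of that schedule (`total_steps` is a placeholder, not this run's actual step count):

```python
def linear_schedule_lr(
    step: int,
    base_lr: float = 5e-5,      # learning_rate from the card
    warmup_steps: int = 500,    # lr_scheduler_warmup_steps from the card
    total_steps: int = 10_000,  # placeholder: depends on dataset size
) -> float:
    """Linear warmup to base_lr, then linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```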

### Training results

| Training Loss | Epoch   | Step  | Validation Loss | Wer    |
|:-------------:|:-------:|:-----:|:---------------:|:------:|
| 1.5686        | 2.3108  | 2000  | 1.2936          | 0.6546 |
| 1.1325        | 4.6216  | 4000  | 1.0153          | 0.4864 |
| 1.3378        | 6.9324  | 6000  | 0.9690          | 0.4716 |
| 0.9299        | 9.2432  | 8000  | 0.9459          | 0.4465 |
| 0.9488        | 11.5540 | 10000 | 0.9376          | 0.4407 |
| 1.1409        | 13.8648 | 12000 | 0.9411          | 0.4288 |
| 1.3824        | 16.1756 | 14000 | 0.9388          | 0.4372 |
| 0.6866        | 18.4864 | 16000 | 0.9380          | 0.4226 |
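
The evaluation rows above can be scanned programmatically to pick the checkpoint with the lowest WER, which here is the final one (step 16000, Wer 0.4226). A small sketch over the table's values:

```python
# (training loss, epoch, step, validation loss, WER) rows from the table above
rows = [
    (1.5686, 2.3108, 2000, 1.2936, 0.6546),
    (1.1325, 4.6216, 4000, 1.0153, 0.4864),
    (1.3378, 6.9324, 6000, 0.9690, 0.4716),
    (0.9299, 9.2432, 8000, 0.9459, 0.4465),
    (0.9488, 11.5540, 10000, 0.9376, 0.4407),
    (1.1409, 13.8648, 12000, 0.9411, 0.4288),
    (1.3824, 16.1756, 14000, 0.9388, 0.4372),
    (0.6866, 18.4864, 16000, 0.9380, 0.4226),
]
# Select the checkpoint with the lowest word error rate
best = min(rows, key=lambda r: r[4])
```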

### Framework versions

- Transformers 4.45.0.dev0
- Pytorch 2.4.1+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1
