---
base_model: Shehryar718/URDU-ASR
tags:
  - generated_from_trainer
datasets:
  - common_voice_13_0
metrics:
  - wer
model-index:
  - name: URDU-ASR-25-EPOCH
    results:
      - task:
          name: Automatic Speech Recognition
          type: automatic-speech-recognition
        dataset:
          name: common_voice_13_0
          type: common_voice_13_0
          config: ur
          split: test
          args: ur
        metrics:
          - name: Wer
            type: wer
            value: 0.47599520290920344
---

# URDU-ASR-25-EPOCH

This model is a fine-tuned version of [Shehryar718/URDU-ASR](https://huggingface.co/Shehryar718/URDU-ASR) on the common_voice_13_0 dataset. It achieves the following results on the evaluation set:

- Loss: 0.6782
- Wer: 0.4760
- Cer: 0.1986
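
The card does not include a usage example; the sketch below shows one way to run a quick WER/CER check with the `transformers` ASR pipeline and the `evaluate` library. The repo id `Shehryar718/URDU-ASR-25-EPOCH`, the 16 kHz sampling rate, and the small `test[:16]` slice are illustrative assumptions; Common Voice 13.0 is a gated dataset, so the terms must be accepted on the Hub and you must be logged in.

```python
import evaluate
from datasets import Audio, load_dataset
from transformers import pipeline

# Load the fine-tuned checkpoint through the generic ASR pipeline
# (repo id assumed from the card's base-model namespace and model name).
asr = pipeline("automatic-speech-recognition", model="Shehryar718/URDU-ASR-25-EPOCH")

# A small slice of the Urdu test split of Common Voice 13.0
# (gated dataset: accept the terms on the Hub and run `huggingface-cli login` first).
ds = load_dataset("mozilla-foundation/common_voice_13_0", "ur", split="test[:16]")
ds = ds.cast_column("audio", Audio(sampling_rate=16_000))  # assumed model sampling rate

predictions = [asr(clip["array"])["text"] for clip in ds["audio"]]
references = ds["sentence"]

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")
print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```

The WER and CER reported above come from the full test split, so a 16-clip slice will not reproduce them exactly.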

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a matching `TrainingArguments` sketch follows the list):

- learning_rate: 7.5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.99) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 25
- mixed_precision_training: Native AMP
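
For orientation, the list above maps onto 🤗 Transformers `TrainingArguments` roughly as in the sketch below; the output directory and anything not listed above are assumptions, not values taken from the card.

```python
from transformers import TrainingArguments

# Illustrative reconstruction of the hyperparameters listed above;
# the output directory is assumed, not taken from the card.
training_args = TrainingArguments(
    output_dir="URDU-ASR-25-EPOCH",  # assumed
    learning_rate=7.5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size: 8 * 2 = 16
    adam_beta1=0.9,
    adam_beta2=0.99,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=25,
    fp16=True,                       # "Native AMP" mixed precision
)
```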

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 5.3707        | 1.0   | 341  | 1.3583          | 0.8266 | 0.3484 |
| 0.4814        | 2.0   | 683  | 0.7213          | 0.5187 | 0.2196 |
| 0.2821        | 3.0   | 1024 | 0.6354          | 0.4917 | 0.2066 |
| 0.2368        | 4.0   | 1366 | 0.6730          | 0.5122 | 0.2181 |
| 0.2105        | 5.0   | 1707 | 0.6430          | 0.4871 | 0.2076 |
| 0.1965        | 6.0   | 2049 | 0.6397          | 0.4902 | 0.2136 |
| 0.1879        | 7.0   | 2390 | 0.6397          | 0.4698 | 0.1951 |
| 0.1743        | 8.0   | 2732 | 0.6636          | 0.4739 | 0.1996 |
| 0.1632        | 9.0   | 3073 | 0.6752          | 0.4782 | 0.1996 |
| 0.1572        | 10.0  | 3415 | 0.6859          | 0.4874 | 0.2072 |
| 0.1586        | 11.0  | 3756 | 0.6761          | 0.4844 | 0.2069 |
| 0.1595        | 12.0  | 4098 | 0.6846          | 0.4746 | 0.1959 |
| 0.1534        | 13.0  | 4439 | 0.6750          | 0.4830 | 0.2034 |
| 0.16          | 14.0  | 4781 | 0.6653          | 0.4826 | 0.2038 |
| 0.1752        | 15.0  | 5122 | 0.6536          | 0.4727 | 0.1946 |
| 0.1739        | 16.0  | 5464 | 0.6753          | 0.4738 | 0.1912 |
| 0.1709        | 17.0  | 5805 | 0.6600          | 0.4730 | 0.1996 |
| 0.1676        | 18.0  | 6147 | 0.6691          | 0.4678 | 0.1919 |
| 0.1636        | 19.0  | 6488 | 0.6638          | 0.4772 | 0.1990 |
| 0.1593        | 20.0  | 6830 | 0.6787          | 0.4764 | 0.1976 |
| 0.1588        | 21.0  | 7171 | 0.6699          | 0.4772 | 0.1974 |
| 0.1525        | 22.0  | 7513 | 0.6827          | 0.4738 | 0.1962 |
| 0.1554        | 23.0  | 7854 | 0.6740          | 0.4736 | 0.1970 |
| 0.1522        | 24.0  | 8196 | 0.6791          | 0.4768 | 0.1989 |
| 0.1502        | 24.96 | 8525 | 0.6782          | 0.4760 | 0.1986 |

### Framework versions

- Transformers 4.35.0
- Pytorch 2.1.0+cu121
- Datasets 2.14.4
- Tokenizers 0.14.1