---
language:
  - hi
base_model: nurzhanit/whisper-enhanced-ml
tags:
  - hf-asr-leaderboard
  - generated_from_trainer
datasets:
  - mozilla-foundation/common_voice_11_0
metrics:
  - wer
model-index:
  - name: Whisper Small Hi - Sanchit Gandhi
    results:
      - task:
          name: Automatic Speech Recognition
          type: automatic-speech-recognition
        dataset:
          name: Common Voice 11.0
          type: mozilla-foundation/common_voice_11_0
          config: default
          split: None
          args: 'config: hi, split: test'
        metrics:
          - name: Wer
            type: wer
            value: 0
---

# Whisper Small Hi - Sanchit Gandhi

This model is a fine-tuned version of nurzhanit/whisper-enhanced-ml on the Common Voice 11.0 dataset. It achieves the following results on the evaluation set:

- Loss: 0.0000
- Wer: 0.0

## Model description

More information needed

## Intended uses & limitations

More information needed
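
Since the card does not yet describe usage, here is a minimal transcription sketch, assuming the checkpoint loads like a standard Whisper model from the Hub; the audio path and generation keyword arguments are illustrative.

```python
import torch
from transformers import pipeline

device = 0 if torch.cuda.is_available() else -1

# Load the checkpoint through the ASR pipeline (assumes a standard Whisper layout).
asr = pipeline(
    "automatic-speech-recognition",
    model="nurzhanit/whisper-enhanced-ml",
    device=device,
)

# "sample_hi.wav" is a placeholder path to a local Hindi recording.
result = asr(
    "sample_hi.wav",
    generate_kwargs={"language": "hindi", "task": "transcribe"},
)
print(result["text"])
```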

## Training and evaluation data

More information needed
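
Until this section is filled in, the metadata points at the Hindi ("hi") config of Common Voice 11.0, with a "test" evaluation split. A minimal loading sketch follows; the dataset is gated on the Hub, so you may need to accept its terms and authenticate first.

```python
from datasets import Audio, load_dataset

# Load the evaluation split named in the metadata (assumption: "hi" config, "test" split).
common_voice = load_dataset("mozilla-foundation/common_voice_11_0", "hi", split="test")

# Whisper feature extractors expect 16 kHz audio, so resample the audio column.
common_voice = common_voice.cast_column("audio", Audio(sampling_rate=16_000))

print(common_voice[0]["sentence"])
```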

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `Seq2SeqTrainingArguments` sketch follows the list):

- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 200
- training_steps: 4000
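
A sketch of how these values map onto `Seq2SeqTrainingArguments` in Transformers 4.40; `output_dir`, the evaluation/save cadence, and `predict_with_generate` are assumptions, not taken from the original run.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-hi",   # hypothetical output path
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=200,
    max_steps=4000,
    evaluation_strategy="steps",       # assumption, consistent with the 100-step cadence below
    eval_steps=100,                    # assumption
    save_steps=100,                    # assumption
    predict_with_generate=True,        # assumption; needed to compute WER from generations
)
```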

### Training results

| Training Loss | Epoch    | Step | Validation Loss | Wer |
|:-------------:|:--------:|:----:|:---------------:|:---:|
| 0.0           | 16.6667  | 100  | 0.0000          | 0.0 |
| 0.0           | 33.3333  | 200  | 0.0000          | 0.0 |
| 0.0           | 50.0     | 300  | 0.0000          | 0.0 |
| 0.0           | 66.6667  | 400  | 0.0000          | 0.0 |
| 0.0           | 83.3333  | 500  | 0.0000          | 0.0 |
| 0.0           | 100.0    | 600  | 0.0000          | 0.0 |
| 0.0           | 116.6667 | 700  | 0.0000          | 0.0 |
| 0.0           | 133.3333 | 800  | 0.0000          | 0.0 |
| 0.0           | 150.0    | 900  | 0.0000          | 0.0 |
| 0.0           | 166.6667 | 1000 | 0.0000          | 0.0 |
| 0.0           | 183.3333 | 1100 | 0.0000          | 0.0 |
| 0.0           | 200.0    | 1200 | 0.0000          | 0.0 |
| 0.0           | 216.6667 | 1300 | 0.0000          | 0.0 |
| 0.0           | 233.3333 | 1400 | 0.0000          | 0.0 |
| 0.0           | 250.0    | 1500 | 0.0000          | 0.0 |
| 0.0           | 266.6667 | 1600 | 0.0000          | 0.0 |
| 0.0           | 283.3333 | 1700 | 0.0000          | 0.0 |
| 0.0           | 300.0    | 1800 | 0.0000          | 0.0 |
| 0.0           | 316.6667 | 1900 | 0.0000          | 0.0 |
| 0.0           | 333.3333 | 2000 | 0.0000          | 0.0 |
| 0.0           | 350.0    | 2100 | 0.0000          | 0.0 |
| 0.0           | 366.6667 | 2200 | 0.0000          | 0.0 |
| 0.0           | 383.3333 | 2300 | 0.0000          | 0.0 |
| 0.0           | 400.0    | 2400 | 0.0000          | 0.0 |
| 0.0           | 416.6667 | 2500 | 0.0000          | 0.0 |
| 0.0           | 433.3333 | 2600 | 0.0000          | 0.0 |
| 0.0           | 450.0    | 2700 | 0.0000          | 0.0 |
| 0.0           | 466.6667 | 2800 | 0.0000          | 0.0 |
| 0.0           | 483.3333 | 2900 | 0.0000          | 0.0 |
| 0.0           | 500.0    | 3000 | 0.0000          | 0.0 |
| 0.0           | 516.6667 | 3100 | 0.0000          | 0.0 |
| 0.0           | 533.3333 | 3200 | 0.0000          | 0.0 |
| 0.0           | 550.0    | 3300 | 0.0000          | 0.0 |
| 0.0           | 566.6667 | 3400 | 0.0000          | 0.0 |
| 0.0           | 583.3333 | 3500 | 0.0000          | 0.0 |
| 0.0           | 600.0    | 3600 | 0.0000          | 0.0 |
| 0.0           | 616.6667 | 3700 | 0.0000          | 0.0 |
| 0.0           | 633.3333 | 3800 | 0.0000          | 0.0 |
| 0.0           | 650.0    | 3900 | 0.0000          | 0.0 |
| 0.0           | 666.6667 | 4000 | 0.0000          | 0.0 |
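
The Wer column above is the word error rate. A minimal sketch of how it is typically computed with the `evaluate` library; the prediction and reference strings are illustrative.

```python
import evaluate

wer_metric = evaluate.load("wer")

# Placeholder strings; in the real evaluation loop these come from model
# generations and the dataset's reference transcriptions.
predictions = ["नमस्ते दुनिया"]
references = ["नमस्ते दुनिया"]

wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.2f}")
```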

### Framework versions

- Transformers 4.40.0
- Pytorch 2.5.0+cu124
- Datasets 3.0.2
- Tokenizers 0.19.1