Whisper Small Hi - Sanchit Gandhi

This model is a fine-tuned version of nurzhanit/whisper-enhanced-ml on the Common Voice 11.0 dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows the results):

  • Loss: 0.0000
  • Wer: 0.0

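For quick verification, here is a minimal transcription sketch using the 🤗 Transformers pipeline. It assumes the checkpoint is published on the Hub under the repository id named above (nurzhanit/whisper-enhanced-ml) and that sample.wav is a Hindi audio clip you supply; both are illustrative assumptions, not part of the original card.

```python
# Minimal inference sketch (assumptions: checkpoint id and audio file are placeholders).
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="nurzhanit/whisper-enhanced-ml",
)

# Transcribe a local audio file; chunking handles clips longer than Whisper's 30-second window.
result = asr("sample.wav", chunk_length_s=30)
print(result["text"])
```
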
Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):

  • learning_rate: 1e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 200
  • training_steps: 4000
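
As a rough guide to reproducing this configuration, the sketch below maps the values above onto Seq2SeqTrainingArguments. The output_dir is an assumed placeholder, and the Adam betas/epsilon listed above correspond to the Trainer's default optimizer settings, so they are assumed to need no explicit arguments here.

```python
# Hedged configuration sketch; only the values listed above are taken from the card.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-hi",   # assumed output path, not from the original run
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=200,
    max_steps=4000,
)
```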

Training results

Training Loss | Epoch    | Step | Validation Loss | Wer
------------- | -------- | ---- | --------------- | ---
0.0           | 16.6667  | 100  | 0.0000          | 0.0
0.0           | 33.3333  | 200  | 0.0000          | 0.0
0.0           | 50.0     | 300  | 0.0000          | 0.0
0.0           | 66.6667  | 400  | 0.0000          | 0.0
0.0           | 83.3333  | 500  | 0.0000          | 0.0
0.0           | 100.0    | 600  | 0.0000          | 0.0
0.0           | 116.6667 | 700  | 0.0000          | 0.0
0.0           | 133.3333 | 800  | 0.0000          | 0.0
0.0           | 150.0    | 900  | 0.0000          | 0.0
0.0           | 166.6667 | 1000 | 0.0000          | 0.0
0.0           | 183.3333 | 1100 | 0.0000          | 0.0
0.0           | 200.0    | 1200 | 0.0000          | 0.0
0.0           | 216.6667 | 1300 | 0.0000          | 0.0
0.0           | 233.3333 | 1400 | 0.0000          | 0.0
0.0           | 250.0    | 1500 | 0.0000          | 0.0
0.0           | 266.6667 | 1600 | 0.0000          | 0.0
0.0           | 283.3333 | 1700 | 0.0000          | 0.0
0.0           | 300.0    | 1800 | 0.0000          | 0.0
0.0           | 316.6667 | 1900 | 0.0000          | 0.0
0.0           | 333.3333 | 2000 | 0.0000          | 0.0
0.0           | 350.0    | 2100 | 0.0000          | 0.0
0.0           | 366.6667 | 2200 | 0.0000          | 0.0
0.0           | 383.3333 | 2300 | 0.0000          | 0.0
0.0           | 400.0    | 2400 | 0.0000          | 0.0
0.0           | 416.6667 | 2500 | 0.0000          | 0.0
0.0           | 433.3333 | 2600 | 0.0000          | 0.0
0.0           | 450.0    | 2700 | 0.0000          | 0.0
0.0           | 466.6667 | 2800 | 0.0000          | 0.0
0.0           | 483.3333 | 2900 | 0.0000          | 0.0
0.0           | 500.0    | 3000 | 0.0000          | 0.0
0.0           | 516.6667 | 3100 | 0.0000          | 0.0
0.0           | 533.3333 | 3200 | 0.0000          | 0.0
0.0           | 550.0    | 3300 | 0.0000          | 0.0
0.0           | 566.6667 | 3400 | 0.0000          | 0.0
0.0           | 583.3333 | 3500 | 0.0000          | 0.0
0.0           | 600.0    | 3600 | 0.0000          | 0.0
0.0           | 616.6667 | 3700 | 0.0000          | 0.0
0.0           | 633.3333 | 3800 | 0.0000          | 0.0
0.0           | 650.0    | 3900 | 0.0000          | 0.0
0.0           | 666.6667 | 4000 | 0.0000          | 0.0
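
The Wer column is the word error rate. As an illustration of how such a score is typically computed with the evaluate library (not necessarily the exact evaluation script used for this run), see the sketch below; the Hindi strings are placeholders.

```python
# Hedged sketch of a WER computation; strings are hypothetical, not from the dataset.
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["नमस्ते दुनिया"]  # hypothetical model output
references = ["नमस्ते दुनिया"]   # hypothetical ground-truth transcript

# Identical strings yield a WER of 0.0, the value reported in the table above.
print(wer_metric.compute(predictions=predictions, references=references))
```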

Framework versions

  • Transformers 4.40.0
  • Pytorch 2.5.0+cu124
  • Datasets 3.0.2
  • Tokenizers 0.19.1