---
library_name: transformers
language:
  - ja
license: apache-2.0
base_model: openai/whisper-large-v3
tags:
  - generated_from_trainer
datasets:
  - nkkbr/NG_word_detect
metrics:
  - wer
model-index:
  - name: NG_word_detect
    results:
      - task:
          name: Automatic Speech Recognition
          type: automatic-speech-recognition
        dataset:
          name: NG_word_detect
          type: nkkbr/NG_word_detect
          args: NG_word_detect
        metrics:
          - name: Wer
            type: wer
            value: 43.02848575712144
---

NG_word_detect

This model is a fine-tuned version of openai/whisper-large-v3 on the NG_word_detect dataset. It achieves the following results on the evaluation set:

  • Loss: 0.1601
  • Wer: 43.0285
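
Since this is a Whisper checkpoint fine-tuned for Japanese speech recognition, it can be run through the transformers ASR pipeline. A minimal inference sketch, assuming the model is published on the Hub as nkkbr/NG_word_detect and that sample.wav is a placeholder path to a local audio file:

```python
import torch
from transformers import pipeline

# Use a GPU if one is available; fall back to CPU otherwise.
device = "cuda:0" if torch.cuda.is_available() else "cpu"

asr = pipeline(
    "automatic-speech-recognition",
    model="nkkbr/NG_word_detect",  # assumed Hub id of this fine-tuned checkpoint
    device=device,
)

# Force Japanese transcription; Whisper otherwise auto-detects the language.
result = asr(
    "sample.wav",  # placeholder path to a local audio file
    generate_kwargs={"language": "japanese", "task": "transcribe"},
)
print(result["text"])
```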

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
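
The metadata above lists nkkbr/NG_word_detect as the training and evaluation dataset. A minimal sketch for inspecting it, assuming it is hosted on the Hub under that id (split and column names are not stated in this card):

```python
from datasets import load_dataset

# Download the dataset and print its splits and columns before deciding how to use it.
ds = load_dataset("nkkbr/NG_word_detect")
print(ds)
```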

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 50
  • training_steps: 500
  • mixed_precision_training: Native AMP
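
For reference, these values map onto Seq2SeqTrainingArguments roughly as sketched below; output_dir, the evaluation cadence (inferred from the 25-step intervals in the results table), and predict_with_generate are assumptions rather than settings stated in this card.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./NG_word_detect",   # assumed output path
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    learning_rate=1e-5,
    lr_scheduler_type="linear",
    warmup_steps=50,
    max_steps=500,
    seed=42,
    fp16=True,                       # native AMP mixed-precision training
    eval_strategy="steps",           # assumed: evaluate every 25 steps, matching the table below
    eval_steps=25,
    predict_with_generate=True,      # assumed, so WER can be computed on generated text
)
```

The Adam betas (0.9, 0.999) and epsilon (1e-08) listed above are the Trainer defaults, so they do not need to be set explicitly.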

Training results

| Training Loss | Epoch  | Step | Validation Loss | Wer     |
|:-------------:|:------:|:----:|:---------------:|:-------:|
| 0.2387        | 0.1524 | 25   | 0.2262          | 59.5952 |
| 0.2003        | 0.3049 | 50   | 0.1823          | 50.6747 |
| 0.1797        | 0.4573 | 75   | 0.1787          | 51.6492 |
| 0.2083        | 0.6098 | 100  | 0.1732          | 49.1004 |
| 0.1798        | 0.7622 | 125  | 0.1681          | 46.9265 |
| 0.136         | 0.9146 | 150  | 0.1684          | 48.6507 |
| 0.0572        | 1.0671 | 175  | 0.1701          | 47.9760 |
| 0.0533        | 1.2195 | 200  | 0.1600          | 45.6522 |
| 0.0735        | 1.3720 | 225  | 0.1644          | 46.4018 |
| 0.0731        | 1.5244 | 250  | 0.1582          | 45.8771 |
| 0.0734        | 1.6768 | 275  | 0.1583          | 44.6777 |
| 0.0714        | 1.8293 | 300  | 0.1552          | 44.1529 |
| 0.0663        | 1.9817 | 325  | 0.1511          | 44.3778 |
| 0.0389        | 2.1341 | 350  | 0.1561          | 42.8786 |
| 0.0143        | 2.2866 | 375  | 0.1618          | 43.7031 |
| 0.0215        | 2.4390 | 400  | 0.1624          | 43.2534 |
| 0.0203        | 2.5915 | 425  | 0.1591          | 43.1784 |
| 0.0309        | 2.7439 | 450  | 0.1617          | 43.3283 |
| 0.0138        | 2.8963 | 475  | 0.1612          | 43.1034 |
| 0.0065        | 3.0488 | 500  | 0.1601          | 43.0285 |
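
The Wer column reports the word error rate scaled to a percentage. A minimal sketch of how such a score can be computed with the evaluate library (the prediction and reference strings are placeholders):

```python
import evaluate

# WER = (substitutions + deletions + insertions) / number of reference words.
wer_metric = evaluate.load("wer")

predictions = ["placeholder hypothesis transcript"]
references = ["placeholder reference transcript"]

# evaluate returns a fraction; the card reports it multiplied by 100.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```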

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.1+cu121
  • Datasets 3.0.0
  • Tokenizers 0.19.1