mms-1b-bigcgen-male-10hrs-model

This model is a fine-tuned version of facebook/mms-1b-all on the BIGCGEN - BEM dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4413
  • Wer: 0.4554

Model description

More information needed

Intended uses & limitations

More information needed
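Although the card gives no usage example, the checkpoint can presumably be loaded like any other MMS/Wav2Vec2 CTC model. The sketch below is an assumption based on the facebook/mms-1b-all base model, not code from this repository; the repo id is taken from this card, and the 16 kHz mono-audio requirement is inherited from the base model.

```python
# Hedged sketch: inference with the fine-tuned checkpoint.
# Assumes the repo id shown on this card and 16 kHz mono input audio,
# as required by the facebook/mms-1b-all base model.
import torch
from transformers import AutoProcessor, Wav2Vec2ForCTC

model_id = "csikasote/mms-1b-bigcgen-male-10hrs-model"
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

def transcribe(waveform, sampling_rate=16_000):
    """Transcribe a 1-D float waveform sampled at 16 kHz."""
    inputs = processor(waveform, sampling_rate=sampling_rate, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    predicted_ids = torch.argmax(logits, dim=-1)
    return processor.batch_decode(predicted_ids)[0]
```

Downloading the ~1 B-parameter checkpoint is required before `transcribe` can run, so this is best treated as a starting point rather than a verified recipe.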

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0003
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 8
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)

  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 30.0
  • mixed_precision_training: Native AMP
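For reference, the hyperparameters above map onto the standard Trainer API roughly as follows. This is a hedged configuration sketch, not the actual training script: field names follow `transformers.TrainingArguments`, while the output directory is a placeholder and the dataset/model wiring is omitted.

```python
# Hedged sketch: the card's hyperparameters as TrainingArguments.
# output_dir is a placeholder; dataset and model setup are not shown here.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mms-1b-bigcgen-male-10hrs-model",  # placeholder
    learning_rate=3e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 4 * 2 = 8
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=30.0,
    fp16=True,  # "Native AMP" mixed precision
)
```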

Training results

Training Loss   Epoch    Step   Validation Loss   Wer
12.1631         0.1548    100   1.2139            0.8800
1.7593          0.3096    200   0.6127            0.5689
1.6361          0.4644    300   0.5921            0.5732
1.6146          0.6192    400   0.5587            0.5547
1.3783          0.7740    500   0.5520            0.5314
1.36            0.9288    600   0.5444            0.5278
1.3447          1.0836    700   0.5394            0.5126
1.3265          1.2384    800   0.5085            0.5028
1.2625          1.3932    900   0.4822            0.5008
1.2793          1.5480   1000   0.5092            0.5037
1.266           1.7028   1100   0.4713            0.4958
1.2451          1.8576   1200   0.4544            0.4780
1.3066          2.0124   1300   0.4491            0.4737
1.2102          2.1672   1400   0.4510            0.4785
1.2384          2.3220   1500   0.4534            0.4756
1.2143          2.4768   1600   0.4538            0.4734
1.0998          2.6316   1700   0.4472            0.4684
1.0608          2.7864   1800   0.4533            0.4616
1.1756          2.9412   1900   0.4384            0.4614
1.0873          3.0960   2000   0.4458            0.4638
1.0788          3.2508   2100   0.4400            0.4573
1.1188          3.4056   2200   0.4412            0.4650
1.2589          3.5604   2300   0.4413            0.4554
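The Wer column is the word error rate: word-level edit distance (substitutions + insertions + deletions) divided by the number of reference words. A minimal pure-Python implementation is shown below for illustration; the numbers above were presumably computed with a library such as jiwer or evaluate, not this function.

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit distance over words (one row at a time).
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, start=1):
        curr = [i] + [0] * len(hyp)
        for j, h in enumerate(hyp, start=1):
            cost = 0 if r == h else 1
            curr[j] = min(prev[j] + 1,         # deletion
                          curr[j - 1] + 1,     # insertion
                          prev[j - 1] + cost)  # substitution
        prev = curr
    return prev[len(hyp)] / max(len(ref), 1)
```

For example, one substitution against a three-word reference gives a WER of 1/3 ≈ 0.33; a WER of 0.4554, as in the final row, means roughly 46 word errors per 100 reference words.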

Framework versions

  • Transformers 4.47.1
  • Pytorch 2.5.1+cu124
  • Datasets 3.2.0
  • Tokenizers 0.21.0