mms-1b-bigcgen-combined-15hrs-model

This model is a fine-tuned version of facebook/mms-1b-all on the BIGCGEN - BEM dataset. It achieves the following results on the evaluation set:

  • Loss: inf
  • Wer: 0.5128
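
As a fine-tuned MMS (Wav2Vec2-style CTC) checkpoint, the model can be loaded for transcription with the standard Transformers ASR pipeline. The snippet below is a minimal sketch only: the audio path "sample.wav" is a placeholder, and 16 kHz mono input is assumed.

```python
# Minimal inference sketch (assumptions: local 16 kHz mono audio file,
# "sample.wav" is a placeholder path).
import torch
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="csikasote/mms-1b-bigcgen-combined-15hrs-model",
    torch_dtype=torch.float32,
)

# The ASR pipeline handles decoding and resampling of the input audio.
result = asr("sample.wav")
print(result["text"])
```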

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0003
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 8
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • training_steps: 2500
  • mixed_precision_training: Native AMP
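
For reference, the sketch below shows how these settings could be expressed with transformers.TrainingArguments. The output directory is a placeholder, and the actual training script is not part of this card.

```python
# Rough mapping of the listed hyperparameters onto TrainingArguments
# (assumption: the training script used the standard Trainer API).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mms-1b-bigcgen-combined-15hrs-model",  # placeholder path
    learning_rate=3e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=2,   # effective train batch size of 8
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    max_steps=2500,
    fp16=True,                       # Native AMP mixed precision
)
```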

Training results

| Training Loss | Epoch  | Step | Validation Loss | Wer    |
|--------------:|-------:|-----:|----------------:|-------:|
| 14.7322       | 0.1019 | 100  | inf             | 1.0006 |
| 6.2607        | 0.2039 | 200  | inf             | 1.0337 |
| 4.9479        | 0.3058 | 300  | inf             | 0.7245 |
| 1.9686        | 0.4077 | 400  | inf             | 0.6016 |
| 1.7287        | 0.5097 | 500  | inf             | 0.5793 |
| 1.684         | 0.6116 | 600  | inf             | 0.5503 |
| 1.5857        | 0.7136 | 700  | inf             | 0.5663 |
| 1.5759        | 0.8155 | 800  | inf             | 0.5418 |
| 1.5899        | 0.9174 | 900  | inf             | 0.5348 |
| 1.5646        | 1.0194 | 1000 | inf             | 0.5305 |
| 1.437         | 1.1213 | 1100 | inf             | 0.5277 |
| 1.5534        | 1.2232 | 1200 | inf             | 0.5238 |
| 1.5637        | 1.3252 | 1300 | inf             | 0.5481 |
| 1.5834        | 1.4271 | 1400 | inf             | 0.5293 |
| 1.5075        | 1.5291 | 1500 | inf             | 0.5228 |
| 1.4501        | 1.6310 | 1600 | inf             | 0.5218 |
| 1.5079        | 1.7329 | 1700 | inf             | 0.5257 |
| 1.439         | 1.8349 | 1800 | inf             | 0.5167 |
| 1.4197        | 1.9368 | 1900 | inf             | 0.5222 |
| 1.5087        | 2.0387 | 2000 | inf             | 0.5156 |
| 1.5303        | 2.1407 | 2100 | inf             | 0.5163 |
| 1.4596        | 2.2426 | 2200 | inf             | 0.5110 |
| 1.3799        | 2.3445 | 2300 | inf             | 0.5165 |
| 1.4368        | 2.4465 | 2400 | inf             | 0.5142 |
| 1.3378        | 2.5484 | 2500 | inf             | 0.5131 |
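
The Wer column reports word error rate on the evaluation set. A minimal sketch of how this metric can be computed with the Hugging Face `evaluate` library is shown below; the example strings are illustrative placeholders, not model outputs.

```python
# Sketch of the WER computation (assumption: the card's metric matches the
# standard `evaluate` WER implementation; strings below are placeholders).
import evaluate

wer_metric = evaluate.load("wer")
wer = wer_metric.compute(
    predictions=["this is a transcription"],  # hypothetical model output
    references=["this is the transcription"],  # hypothetical reference text
)
print(f"WER: {wer:.4f}")
```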

Framework versions

  • Transformers 4.47.1
  • Pytorch 2.5.1+cu124
  • Datasets 3.2.0
  • Tokenizers 0.21.0