

bambara-mms-10-hours-oza75bambara-asr-hf

This model is a fine-tuned version of facebook/mms-1b-all on an unknown dataset. It achieves the following results on the evaluation set (a brief usage sketch follows the list):

  • Loss: 2.1148
  • Wer: 0.5199
  • Cer: 0.2465
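To use the checkpoint for transcription, here is a minimal inference sketch. It assumes the checkpoint loads as a standard Wav2Vec2-style CTC model (as facebook/mms-1b-all does); the audio path is a placeholder.

```python
import torch
import librosa
from transformers import AutoModelForCTC, AutoProcessor

model_id = "asr-africa/bambara-mms-10-hours-oza75bambara-asr-hf"
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForCTC.from_pretrained(model_id)

# MMS models expect 16 kHz mono audio; "example.wav" is a placeholder path.
speech, _ = librosa.load("example.wav", sr=16_000, mono=True)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

# Greedy CTC decoding: take the argmax token at each frame, then collapse
# repeats and blanks inside batch_decode.
with torch.no_grad():
    logits = model(**inputs).logits
ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(ids)[0])
```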

Model description

The base checkpoint, facebook/mms-1b-all, is a Wav2Vec2-style multilingual speech model from Meta's Massively Multilingual Speech (MMS) project. This repository fine-tunes it for Bambara automatic speech recognition; the repository name suggests roughly 10 hours of training audio drawn from oza75/bambara-asr, though the dataset is not documented in this card.

Intended uses & limitations

Intended for transcribing Bambara speech (16 kHz mono audio). With a best evaluation WER of roughly 0.52 and CER of roughly 0.25, transcripts will contain frequent word-level errors and should be reviewed before downstream use. Validation loss also rises steadily after about epoch 10 while WER/CER improve only slowly, which suggests overfitting to the small training set.

Training and evaluation data

Not documented. As noted above, the repository name points to a 10-hour subset of the oza75/bambara-asr dataset; the evaluation split is unspecified.

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 0.0003
  • train_batch_size: 8
  • eval_batch_size: 16
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 50
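For reference, these values map onto transformers.TrainingArguments roughly as follows. This is a sketch, not the original training script; the output directory is invented, and the Adam settings are written out explicitly to match the reported optimizer.

```python
from transformers import TrainingArguments

# Sketch reconstructing the reported hyperparameters; output_dir is hypothetical.
training_args = TrainingArguments(
    output_dir="bambara-mms-10-hours",
    learning_rate=3e-4,
    per_device_train_batch_size=8,   # with 2 accumulation steps -> effective 16
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=2,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=50,
    adam_beta1=0.9,                  # matches betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```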

Training results

| Training Loss | Epoch   | Step  | Validation Loss | Wer    | Cer    |
|:-------------:|:-------:|:-----:|:---------------:|:------:|:------:|
| 1.5938        | 0.8313  | 500   | 1.2774          | 0.8294 | 0.3861 |
| 1.5142        | 1.6625  | 1000  | 1.2540          | 0.7729 | 0.3702 |
| 1.3973        | 2.4938  | 1500  | 1.2161          | 0.7125 | 0.3521 |
| 1.326         | 3.3250  | 2000  | 1.1719          | 0.7260 | 0.3426 |
| 1.2698        | 4.1563  | 2500  | 1.1710          | 0.6738 | 0.3350 |
| 1.218         | 4.9875  | 3000  | 1.0692          | 0.6507 | 0.3147 |
| 1.1672        | 5.8188  | 3500  | 1.0691          | 0.6408 | 0.3071 |
| 1.1307        | 6.6500  | 4000  | 1.0518          | 0.6405 | 0.3035 |
| 1.079         | 7.4813  | 4500  | 1.1091          | 0.6304 | 0.2972 |
| 1.0474        | 8.3126  | 5000  | 1.0662          | 0.6240 | 0.2994 |
| 1.0169        | 9.1438  | 5500  | 1.0669          | 0.6173 | 0.2937 |
| 0.9753        | 9.9751  | 6000  | 1.0039          | 0.6329 | 0.3027 |
| 0.9303        | 10.8063 | 6500  | 0.9909          | 0.6101 | 0.2863 |
| 0.8867        | 11.6376 | 7000  | 1.0058          | 0.5929 | 0.2888 |
| 0.8537        | 12.4688 | 7500  | 1.0321          | 0.6015 | 0.2859 |
| 0.8268        | 13.3001 | 8000  | 1.0427          | 0.5960 | 0.2808 |
| 0.7908        | 14.1313 | 8500  | 1.0816          | 0.5847 | 0.2791 |
| 0.7625        | 14.9626 | 9000  | 1.0817          | 0.5839 | 0.2748 |
| 0.7254        | 15.7938 | 9500  | 1.1185          | 0.5764 | 0.2768 |
| 0.6817        | 16.6251 | 10000 | 1.1043          | 0.5658 | 0.2749 |
| 0.6602        | 17.4564 | 10500 | 1.1736          | 0.5640 | 0.2708 |
| 0.6244        | 18.2876 | 11000 | 1.1894          | 0.5725 | 0.2721 |
| 0.6073        | 19.1189 | 11500 | 1.2478          | 0.5647 | 0.2737 |
| 0.5752        | 19.9501 | 12000 | 1.1698          | 0.5671 | 0.2770 |
| 0.5457        | 20.7814 | 12500 | 1.1864          | 0.5587 | 0.2697 |
| 0.5231        | 21.6126 | 13000 | 1.1907          | 0.5592 | 0.2693 |
| 0.5018        | 22.4439 | 13500 | 1.1874          | 0.5675 | 0.2707 |
| 0.4756        | 23.2751 | 14000 | 1.2167          | 0.5626 | 0.2664 |
| 0.458         | 24.1064 | 14500 | 1.2149          | 0.5592 | 0.2712 |
| 0.4301        | 24.9377 | 15000 | 1.3165          | 0.5499 | 0.2660 |
| 0.4165        | 25.7689 | 15500 | 1.2436          | 0.5689 | 0.2713 |
| 0.3897        | 26.6002 | 16000 | 1.3646          | 0.5470 | 0.2624 |
| 0.3743        | 27.4314 | 16500 | 1.4319          | 0.5477 | 0.2623 |
| 0.355         | 28.2627 | 17000 | 1.4645          | 0.5580 | 0.2683 |
| 0.3414        | 29.0939 | 17500 | 1.5373          | 0.5508 | 0.2590 |
| 0.3251        | 29.9252 | 18000 | 1.4997          | 0.5473 | 0.2648 |
| 0.3049        | 30.7564 | 18500 | 1.5231          | 0.5494 | 0.2617 |
| 0.2924        | 31.5877 | 19000 | 1.6212          | 0.5490 | 0.2610 |
| 0.2807        | 32.4190 | 19500 | 1.5959          | 0.5571 | 0.2661 |
| 0.2669        | 33.2502 | 20000 | 1.6300          | 0.5478 | 0.2628 |
| 0.2575        | 34.0815 | 20500 | 1.7042          | 0.5447 | 0.2616 |
| 0.2426        | 34.9127 | 21000 | 1.6750          | 0.5430 | 0.2607 |
| 0.2301        | 35.7440 | 21500 | 1.7449          | 0.5421 | 0.2596 |
| 0.2202        | 36.5752 | 22000 | 1.6587          | 0.5418 | 0.2584 |
| 0.213         | 37.4065 | 22500 | 1.7982          | 0.5361 | 0.2546 |
| 0.2092        | 38.2377 | 23000 | 1.7748          | 0.5271 | 0.2537 |
| 0.1956        | 39.0690 | 23500 | 1.8427          | 0.5310 | 0.2562 |
| 0.1869        | 39.9002 | 24000 | 1.7940          | 0.5275 | 0.2541 |
| 0.1799        | 40.7315 | 24500 | 1.7794          | 0.5275 | 0.2520 |
| 0.1727        | 41.5628 | 25000 | 1.9008          | 0.5374 | 0.2540 |
| 0.1681        | 42.3940 | 25500 | 1.9119          | 0.5297 | 0.2522 |
| 0.1596        | 43.2253 | 26000 | 1.9836          | 0.5258 | 0.2472 |
| 0.1569        | 44.0565 | 26500 | 1.9823          | 0.5195 | 0.2472 |
| 0.1516        | 44.8878 | 27000 | 1.9638          | 0.5179 | 0.2483 |
| 0.1482        | 45.7190 | 27500 | 2.0763          | 0.5146 | 0.2468 |
| 0.1379        | 46.5503 | 28000 | 2.0760          | 0.5234 | 0.2483 |
| 0.1407        | 47.3815 | 28500 | 2.0269          | 0.5220 | 0.2482 |
| 0.1331        | 48.2128 | 29000 | 2.0818          | 0.5221 | 0.2481 |
| 0.1328        | 49.0441 | 29500 | 2.0947          | 0.5205 | 0.2467 |
| 0.1308        | 49.8753 | 30000 | 2.1148          | 0.5199 | 0.2465 |
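The Wer and Cer columns are standard word- and character-error rates. A minimal sketch of how such numbers are typically computed with the evaluate library follows; the transcripts below are placeholders, not data from this model.

```python
import evaluate

wer = evaluate.load("wer")
cer = evaluate.load("cer")  # requires the jiwer package

# Placeholder reference/hypothesis pair, purely for illustration.
references = ["an ka taa so"]
predictions = ["an ka ta so"]

print("WER:", wer.compute(references=references, predictions=predictions))
print("CER:", cer.compute(references=references, predictions=predictions))
```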

Framework versions

  • Transformers 4.45.1
  • PyTorch 2.1.0+cu118
  • Datasets 2.17.0
  • Tokenizers 0.20.3
