---
library_name: transformers
datasets:
- oza75/bambara-asr
- djelia/bambara-audio-b
language:
- bm
metrics:
- cer
- wer
base_model:
- openai/whisper-large-v2
---

This model is a fine-tuned version of openai/whisper-large-v2 using PEFT with LoRA. It achieved a word error rate (WER) of 21% and a character error rate (CER) of 0.088 on the test split of djelia/bambara-asr.
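
Both reported metrics are ratios of Levenshtein edit distance to reference length, computed over words (WER) or characters (CER). The sketch below is a minimal pure-Python illustration of how they are defined, not the actual evaluation script used for this model:

```python
def edit_distance(ref, hyp):
    # Classic Levenshtein distance (substitutions, insertions, deletions)
    # computed with a single rolling row of dynamic-programming state.
    m, n = len(ref), len(hyp)
    dp = list(range(n + 1))
    for i in range(1, m + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cur = dp[j]
            dp[j] = min(
                dp[j] + 1,                          # deletion
                dp[j - 1] + 1,                      # insertion
                prev + (ref[i - 1] != hyp[j - 1]),  # substitution / match
            )
            prev = cur
    return dp[n]

def wer(reference: str, hypothesis: str) -> float:
    # Word error rate: word-level edits divided by reference word count.
    ref_words, hyp_words = reference.split(), hypothesis.split()
    return edit_distance(ref_words, hyp_words) / len(ref_words)

def cer(reference: str, hypothesis: str) -> float:
    # Character error rate: character-level edits divided by reference length.
    return edit_distance(reference, hypothesis) / len(reference)

# One substituted word out of four -> WER 0.25
print(wer("ne bɛ taa sugu la", "ne bɛ taa so la"))
```

In practice, evaluation pipelines typically use a library such as `jiwer` or the `evaluate` package's `wer`/`cer` metrics, which also handle text normalization before scoring.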