---
tags:
- automatic-speech-recognition
- dna_r9.4.1
- generated_from_trainer
model-index:
- name: wav2vec2-tiny-demo
  results: []
---

# wav2vec2-tiny-demo

This model is a fine-tuned version of [yenpolin/wav2vec2-tiny](https://huggingface.co/yenpolin/wav2vec2-tiny) on the DNA_R9.4.1 - NA dataset.
It achieves the following results on the evaluation set:
- Loss: 1.3174
- Mean Acc: 41.4052
- Median Acc: 54.7558

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 400
- eval_batch_size: 800
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100.0
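For reference, the sketch below shows one way these values could be expressed with the `transformers` `TrainingArguments` API. It is an illustration under assumptions, not the original training script: the `output_dir` is a placeholder, and the card does not say whether the listed batch sizes are per device or totals.

```python
# Hypothetical reconstruction of the hyperparameters listed above; the actual
# training script behind this card is not available, so treat this as a sketch.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-tiny-demo",  # placeholder, not taken from the original run
    learning_rate=3e-4,               # 0.0003
    per_device_train_batch_size=400,  # card lists train_batch_size: 400
    per_device_eval_batch_size=800,   # card lists eval_batch_size: 800
    seed=42,
    adam_beta1=0.9,                   # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,                # epsilon=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=100.0,
)
```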
### Training results

| Training Loss | Epoch | Step  | Validation Loss | Mean Acc | Median Acc |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|
| No log | 1.0 | 250 | 1.2708 | 0.0 | 0.0 |
| 1.2586 | 2.0 | 500 | 1.2401 | 0.0 | 0.0 |
| 1.2586 | 3.0 | 750 | 1.2212 | 0.0 | 0.0 |
| 1.1999 | 4.0 | 1000 | 1.1985 | 0.0026 | 0.0 |
| 1.1999 | 5.0 | 1250 | 1.1824 | 0.0122 | 0.0 |
| 1.1635 | 6.0 | 1500 | 1.1716 | 0.0301 | 0.0 |
| 1.1635 | 7.0 | 1750 | 1.1562 | 0.0296 | 0.0 |
| 1.1361 | 8.0 | 2000 | 1.1507 | 0.0088 | 0.0 |
| 1.1361 | 9.0 | 2250 | 1.1393 | 0.0649 | 0.0 |
| 1.1142 | 10.0 | 2500 | 1.1311 | 0.0150 | 0.0 |
| 1.1142 | 11.0 | 2750 | 1.0849 | 0.0534 | 0.0 |
| 1.0648 | 12.0 | 3000 | 1.0644 | 0.0912 | 0.0 |
| 1.0648 | 13.0 | 3250 | 1.0415 | 0.1348 | 0.0 |
| 1.0185 | 14.0 | 3500 | 1.0390 | 0.0443 | 0.0 |
| 1.0185 | 15.0 | 3750 | 1.0214 | 0.1120 | 0.0 |
| 0.9951 | 16.0 | 4000 | 1.0152 | 0.1647 | 0.0 |
| 0.9951 | 17.0 | 4250 | 1.0193 | 0.1194 | 0.0 |
| 0.9813 | 18.0 | 4500 | 1.0092 | 0.1182 | 0.0 |
| 0.9813 | 19.0 | 4750 | 1.0165 | 0.0672 | 0.0 |
| 0.9625 | 20.0 | 5000 | 0.9430 | 9.0082 | 0.0 |
| 0.9625 | 21.0 | 5250 | 0.9301 | 13.2452 | 0.0 |
| 0.8958 | 22.0 | 5500 | 0.9261 | 8.9878 | 0.0 |
| 0.8958 | 23.0 | 5750 | 0.9174 | 15.3641 | 0.0 |
| 0.8756 | 24.0 | 6000 | 0.9116 | 14.9011 | 0.0 |
| 0.8756 | 25.0 | 6250 | 0.9130 | 12.2681 | 0.0 |
| 0.8607 | 26.0 | 6500 | 0.9114 | 15.2874 | 0.0 |
| 0.8607 | 27.0 | 6750 | 0.9105 | 24.1205 | 0.0 |
| 0.8482 | 28.0 | 7000 | 0.9083 | 18.6607 | 0.0 |
| 0.8482 | 29.0 | 7250 | 0.9196 | 17.7246 | 0.0 |
| 0.8359 | 30.0 | 7500 | 0.9149 | 19.7733 | 0.0 |
| 0.8359 | 31.0 | 7750 | 0.9133 | 18.6297 | 0.0 |
| 0.8232 | 32.0 | 8000 | 0.9479 | 12.2703 | 0.0 |
| 0.8232 | 33.0 | 8250 | 0.9303 | 19.9043 | 0.0 |
| 0.8092 | 34.0 | 8500 | 0.9300 | 22.3510 | 0.0 |
| 0.8092 | 35.0 | 8750 | 0.9295 | 27.8118 | 0.0 |
| 0.7951 | 36.0 | 9000 | 0.9439 | 23.2963 | 0.0 |
| 0.7951 | 37.0 | 9250 | 0.9620 | 20.9074 | 0.0 |
| 0.7803 | 38.0 | 9500 | 0.9571 | 28.7828 | 52.3026 |
| 0.7803 | 39.0 | 9750 | 0.9814 | 25.2679 | 0.0 |
| 0.7669 | 40.0 | 10000 | 0.9787 | 31.2820 | 53.3865 |
| 0.7669 | 41.0 | 10250 | 0.9765 | 28.6333 | 52.2034 |
| 0.7529 | 42.0 | 10500 | 1.0038 | 27.5154 | 0.0 |
| 0.7529 | 43.0 | 10750 | 1.0338 | 28.4940 | 52.1368 |
| 0.7411 | 44.0 | 11000 | 1.0279 | 28.7206 | 52.2782 |
| 0.7411 | 45.0 | 11250 | 1.0177 | 29.0583 | 52.4116 |
| 0.7299 | 46.0 | 11500 | 1.0148 | 32.9230 | 53.6 |
| 0.7299 | 47.0 | 11750 | 1.0400 | 33.5174 | 53.8136 |
| 0.7198 | 48.0 | 12000 | 1.0478 | 30.9309 | 53.2544 |
| 0.7198 | 49.0 | 12250 | 1.0538 | 29.2445 | 52.5692 |
| 0.7109 | 50.0 | 12500 | 1.0524 | 27.2312 | 0.0 |
| 0.7109 | 51.0 | 12750 | 1.0862 | 33.3504 | 53.7234 |
| 0.7036 | 52.0 | 13000 | 1.0744 | 31.8779 | 53.3679 |
| 0.7036 | 53.0 | 13250 | 1.0628 | 28.3728 | 51.7123 |
| 0.6963 | 54.0 | 13500 | 1.0587 | 30.9822 | 53.125 |
| 0.6963 | 55.0 | 13750 | 1.0834 | 33.2130 | 53.5714 |
| 0.69 | 56.0 | 14000 | 1.1077 | 36.3846 | 54.2857 |
| 0.69 | 57.0 | 14250 | 1.1150 | 32.3586 | 53.4954 |
| 0.6855 | 58.0 | 14500 | 1.1352 | 36.9014 | 54.4061 |
| 0.6855 | 59.0 | 14750 | 1.1557 | 33.8871 | 53.8462 |
| 0.6811 | 60.0 | 15000 | 1.1315 | 33.5959 | 53.7736 |
| 0.6811 | 61.0 | 15250 | 1.0957 | 32.9908 | 53.5714 |
| 0.6768 | 62.0 | 15500 | 1.1236 | 32.7653 | 53.6122 |
| 0.6768 | 63.0 | 15750 | 1.1153 | 34.8283 | 53.9432 |
| 0.6722 | 64.0 | 16000 | 1.1300 | 35.0110 | 53.9683 |
| 0.6722 | 65.0 | 16250 | 1.1826 | 35.9229 | 54.1463 |
| 0.6682 | 66.0 | 16500 | 1.1534 | 38.5511 | 54.4850 |
| 0.6682 | 67.0 | 16750 | 1.1636 | 35.8289 | 54.0984 |
| 0.6653 | 68.0 | 17000 | 1.1404 | 34.6543 | 53.8462 |
| 0.6653 | 69.0 | 17250 | 1.1473 | 36.2493 | 54.1209 |
| 0.6624 | 70.0 | 17500 | 1.1532 | 39.5140 | 54.5775 |
| 0.6624 | 71.0 | 17750 | 1.1715 | 36.2354 | 54.1254 |
| 0.6597 | 72.0 | 18000 | 1.1875 | 35.5023 | 54.0856 |
| 0.6597 | 73.0 | 18250 | 1.1643 | 34.5047 | 53.9024 |
| 0.657 | 74.0 | 18500 | 1.1894 | 38.7561 | 54.4928 |
| 0.657 | 75.0 | 18750 | 1.2082 | 38.2157 | 54.4601 |
| 0.6543 | 76.0 | 19000 | 1.1843 | 34.2857 | 53.8835 |
| 0.6543 | 77.0 | 19250 | 1.1689 | 38.4264 | 54.4355 |
| 0.652 | 78.0 | 19500 | 1.2085 | 37.7495 | 54.3750 |
| 0.652 | 79.0 | 19750 | 1.1985 | 39.1444 | 54.5190 |
| 0.6497 | 80.0 | 20000 | 1.2331 | 40.0190 | 54.6125 |
| 0.6497 | 81.0 | 20250 | 1.2403 | 39.0511 | 54.5205 |
| 0.6476 | 82.0 | 20500 | 1.1910 | 37.2936 | 54.2453 |
| 0.6476 | 83.0 | 20750 | 1.2035 | 41.2572 | 54.7244 |
| 0.6457 | 84.0 | 21000 | 1.2124 | 38.7565 | 54.4892 |
| 0.6457 | 85.0 | 21250 | 1.2327 | 39.1286 | 54.4959 |
| 0.6437 | 86.0 | 21500 | 1.2252 | 39.8797 | 54.5775 |
| 0.6437 | 87.0 | 21750 | 1.2346 | 38.1128 | 54.3860 |
| 0.642 | 88.0 | 22000 | 1.2441 | 40.4358 | 54.6512 |
| 0.642 | 89.0 | 22250 | 1.2488 | 39.9912 | 54.5977 |
| 0.6403 | 90.0 | 22500 | 1.2483 | 39.9116 | 54.6053 |
| 0.6403 | 91.0 | 22750 | 1.2674 | 40.9458 | 54.7009 |
| 0.6387 | 92.0 | 23000 | 1.2694 | 40.4423 | 54.6358 |
| 0.6387 | 93.0 | 23250 | 1.2717 | 40.7397 | 54.6763 |
| 0.6371 | 94.0 | 23500 | 1.2820 | 40.5719 | 54.6392 |
| 0.6371 | 95.0 | 23750 | 1.2883 | 40.3534 | 54.6667 |
| 0.6358 | 96.0 | 24000 | 1.3039 | 40.6046 | 54.6667 |
| 0.6358 | 97.0 | 24250 | 1.3068 | 41.4525 | 54.7739 |
| 0.6347 | 98.0 | 24500 | 1.3132 | 41.4247 | 54.7541 |
| 0.6347 | 99.0 | 24750 | 1.3124 | 41.2563 | 54.7445 |
| 0.6339 | 100.0 | 25000 | 1.3174 | 41.4052 | 54.7558 |

### Framework versions

- Transformers 4.26.0
- Pytorch 1.12.1
- Datasets 2.9.0
- Tokenizers 0.13.2
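With the versions pinned above, the following is a minimal, hypothetical inference sketch. Everything in it is an assumption rather than part of this card: the local path is a placeholder, the checkpoint is assumed to load with the standard Wav2Vec2 CTC classes, the random tensor merely stands in for a normalized nanopore R9.4.1 signal, and the greedy CTC decoding is only illustrative.

```python
# Minimal, hypothetical inference sketch; path and input are placeholders.
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2CTCTokenizer

checkpoint = "./wav2vec2-tiny-demo"  # placeholder local path, not a published repo id

model = Wav2Vec2ForCTC.from_pretrained(checkpoint)
tokenizer = Wav2Vec2CTCTokenizer.from_pretrained(checkpoint)
model.eval()

# The model consumes a 1-D float signal per example; random values stand in
# for a normalized nanopore current trace of 4000 samples.
signal = torch.randn(1, 4000)

with torch.no_grad():
    logits = model(input_values=signal).logits  # (batch, frames, vocab_size)

# Greedy CTC decoding: pick the best token per frame, then let the tokenizer
# collapse repeats and drop blanks.
pred_ids = torch.argmax(logits, dim=-1)
print(tokenizer.batch_decode(pred_ids))
```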