he-cantillation

This model is a fine-tuned version of openai/whisper-tiny on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2807
  • Wer: 16.7878
  • Avg Precision Exact: 0.8378
  • Avg Recall Exact: 0.8397
  • Avg F1 Exact: 0.8382
  • Avg Precision Letter Shift: 0.8599
  • Avg Recall Letter Shift: 0.8619
  • Avg F1 Letter Shift: 0.8604
  • Avg Precision Word Level: 0.8648
  • Avg Recall Word Level: 0.8671
  • Avg F1 Word Level: 0.8655
  • Avg Precision Word Shift: 0.9472
  • Avg Recall Word Shift: 0.9510
  • Avg F1 Word Shift: 0.9485
  • Precision Median Exact: 0.9231
  • Recall Median Exact: 0.9231
  • F1 Median Exact: 0.9286
  • Precision Max Exact: 1.0
  • Recall Max Exact: 1.0
  • F1 Max Exact: 1.0
  • Precision Min Exact: 0.0
  • Recall Min Exact: 0.0
  • F1 Min Exact: 0.0
  • Precision Min Letter Shift: 0.0
  • Recall Min Letter Shift: 0.0
  • F1 Min Letter Shift: 0.0
  • Precision Min Word Level: 0.0
  • Recall Min Word Level: 0.0
  • F1 Min Word Level: 0.0
  • Precision Min Word Shift: 0.0833
  • Recall Min Word Shift: 0.0769
  • F1 Min Word Shift: 0.08
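
The Wer figure above is a word error rate in percent. The card does not state how it was computed, but a minimal word-level sketch (an assumption, not necessarily the exact implementation used for evaluation) looks like this:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate in percent: word-level Levenshtein distance
    divided by the number of reference words."""
    ref = reference.split()
    hyp = hypothesis.split()
    # DP table for edit distance between the two word sequences.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return 100.0 * d[len(ref)][len(hyp)] / max(len(ref), 1)

print(wer("a b c d", "a x c"))  # one substitution + one deletion over 4 words → 50.0
```

The precision/recall/F1 variants (Exact, Letter Shift, Word Level, Word Shift) appear to score predicted tokens against the reference under increasingly tolerant alignment rules, but their exact definitions are not documented here.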

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 8
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1000
  • training_steps: 500000
  • mixed_precision_training: Native AMP
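
Assuming the linear scheduler follows the usual Transformers semantics (linear warmup to the base rate, then linear decay to zero, as in `get_linear_schedule_with_warmup`), the learning rate at a given step can be sketched as:

```python
def linear_lr(step: int,
              base_lr: float = 1e-5,
              warmup_steps: int = 1000,
              total_steps: int = 500_000) -> float:
    """Linear warmup followed by linear decay to zero, mirroring the
    hyperparameters listed above (assumed scheduler semantics)."""
    if step < warmup_steps:
        # Ramp up from 0 to base_lr over the warmup period.
        return base_lr * step / warmup_steps
    # Decay linearly from base_lr at the end of warmup to 0 at total_steps.
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_lr(1000))     # peak learning rate: 1e-05
print(linear_lr(500_000))  # fully decayed: 0.0
```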

Training results

| Training Loss | Epoch | Step | Validation Loss | Wer | Avg Precision Exact | Avg Recall Exact | Avg F1 Exact | Avg Precision Letter Shift | Avg Recall Letter Shift | Avg F1 Letter Shift | Avg Precision Word Level | Avg Recall Word Level | Avg F1 Word Level | Avg Precision Word Shift | Avg Recall Word Shift | Avg F1 Word Shift | Precision Median Exact | Recall Median Exact | F1 Median Exact | Precision Max Exact | Recall Max Exact | F1 Max Exact | Precision Min Exact | Recall Min Exact | F1 Min Exact | Precision Min Letter Shift | Recall Min Letter Shift | F1 Min Letter Shift | Precision Min Word Level | Recall Min Word Level | F1 Min Word Level | Precision Min Word Shift | Recall Min Word Shift | F1 Min Word Shift |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| No log | 0.0001 | 1 | 9.1276 | 100.0 | 0.0016 | 0.0166 | 0.0028 | 0.0603 | 0.0627 | 0.0613 | 0.0150 | 0.1274 | 0.0263 | 0.3304 | 0.3453 | 0.3374 | 0.0 | 0.0 | 0.0 | 0.125 | 1.0 | 0.2222 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.1478 | 0.5167 | 10000 | 0.2082 | 31.8890 | 0.6951 | 0.7013 | 0.6974 | 0.7292 | 0.7359 | 0.7317 | 0.7377 | 0.7445 | 0.7402 | 0.8718 | 0.8824 | 0.8760 | 0.8 | 0.8 | 0.8 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1 | 0.1 | 0.1000 |
| 0.0557 | 1.0334 | 20000 | 0.1807 | 26.1682 | 0.7472 | 0.7539 | 0.7499 | 0.7761 | 0.7831 | 0.7789 | 0.7827 | 0.7903 | 0.7858 | 0.8946 | 0.9045 | 0.8986 | 0.8462 | 0.8571 | 0.8571 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1538 | 0.1667 | 0.16 |
| 0.0339 | 1.5501 | 30000 | 0.1813 | 24.2204 | 0.7732 | 0.7738 | 0.7729 | 0.8018 | 0.8024 | 0.8014 | 0.8076 | 0.8084 | 0.8073 | 0.9113 | 0.9142 | 0.9118 | 0.875 | 0.875 | 0.875 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0769 | 0.0909 | 0.0833 |
| 0.0206 | 2.0668 | 40000 | 0.1838 | 22.7068 | 0.7804 | 0.7837 | 0.7814 | 0.8073 | 0.8108 | 0.8084 | 0.8133 | 0.8167 | 0.8144 | 0.9158 | 0.9216 | 0.9179 | 0.8889 | 0.9 | 0.8889 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0136 | 2.5834 | 50000 | 0.1874 | 21.5520 | 0.7908 | 0.7923 | 0.7909 | 0.8179 | 0.8195 | 0.8180 | 0.8230 | 0.8249 | 0.8233 | 0.9247 | 0.9283 | 0.9257 | 0.9 | 0.9 | 0.8889 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0296 | 3.1001 | 60000 | 0.1920 | 21.2247 | 0.7969 | 0.7999 | 0.7978 | 0.8213 | 0.8245 | 0.8223 | 0.8270 | 0.8307 | 0.8282 | 0.9244 | 0.9300 | 0.9264 | 0.9 | 0.9 | 0.9 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0147 | 3.6168 | 70000 | 0.2004 | 21.0233 | 0.7936 | 0.7975 | 0.7949 | 0.8210 | 0.8253 | 0.8224 | 0.8270 | 0.8318 | 0.8287 | 0.9278 | 0.9353 | 0.9306 | 0.9 | 0.9091 | 0.9 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.1111 | 0.125 |
| 0.0056 | 4.1335 | 80000 | 0.2070 | 20.3436 | 0.8062 | 0.8047 | 0.8049 | 0.8311 | 0.8296 | 0.8298 | 0.8372 | 0.8362 | 0.8361 | 0.9321 | 0.9335 | 0.9320 | 0.9091 | 0.9091 | 0.9091 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.1111 | 0.125 |
| 0.0059 | 4.6502 | 90000 | 0.2076 | 20.4726 | 0.8049 | 0.8078 | 0.8058 | 0.8309 | 0.8340 | 0.8319 | 0.8365 | 0.8403 | 0.8378 | 0.9279 | 0.9344 | 0.9304 | 0.9091 | 0.9091 | 0.9091 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.125 | 0.1333 |
| 0.0049 | 5.1669 | 100000 | 0.2229 | 20.5198 | 0.8003 | 0.8055 | 0.8024 | 0.8256 | 0.8312 | 0.8278 | 0.8314 | 0.8370 | 0.8336 | 0.9251 | 0.9333 | 0.9284 | 0.9091 | 0.9091 | 0.9 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0065 | 5.6836 | 110000 | 0.2206 | 19.8999 | 0.8084 | 0.8116 | 0.8095 | 0.8332 | 0.8368 | 0.8344 | 0.8390 | 0.8429 | 0.8404 | 0.9307 | 0.9369 | 0.9331 | 0.9091 | 0.9091 | 0.9091 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0024 | 6.2003 | 120000 | 0.2204 | 19.3430 | 0.8114 | 0.8150 | 0.8126 | 0.8360 | 0.8398 | 0.8373 | 0.8414 | 0.8454 | 0.8428 | 0.9331 | 0.9386 | 0.9351 | 0.9091 | 0.9167 | 0.9167 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0046 | 6.7170 | 130000 | 0.2275 | 19.3335 | 0.8185 | 0.8199 | 0.8186 | 0.8430 | 0.8446 | 0.8432 | 0.8485 | 0.8505 | 0.8489 | 0.9337 | 0.9362 | 0.9342 | 0.9167 | 0.9167 | 0.9167 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0909 | 0.0833 | 0.0870 |
| 0.0036 | 7.2336 | 140000 | 0.2297 | 19.3146 | 0.8115 | 0.8142 | 0.8123 | 0.8358 | 0.8390 | 0.8368 | 0.8408 | 0.8441 | 0.8418 | 0.9331 | 0.9375 | 0.9346 | 0.9091 | 0.9091 | 0.9091 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0909 | 0.0714 | 0.0833 |
| 0.0028 | 7.7503 | 150000 | 0.2331 | 19.4846 | 0.8112 | 0.8118 | 0.8110 | 0.8365 | 0.8372 | 0.8363 | 0.8416 | 0.8427 | 0.8416 | 0.9333 | 0.9363 | 0.9340 | 0.9091 | 0.9091 | 0.9091 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.003 | 8.2670 | 160000 | 0.2376 | 19.0975 | 0.8109 | 0.8132 | 0.8116 | 0.8345 | 0.8370 | 0.8353 | 0.8403 | 0.8429 | 0.8411 | 0.9360 | 0.9402 | 0.9374 | 0.9167 | 0.9167 | 0.9167 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0038 | 8.7837 | 170000 | 0.2412 | 19.0692 | 0.8147 | 0.8163 | 0.8150 | 0.8395 | 0.8413 | 0.8398 | 0.8453 | 0.8473 | 0.8457 | 0.9358 | 0.9399 | 0.9371 | 0.9167 | 0.9167 | 0.9167 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0013 | 9.3004 | 180000 | 0.2444 | 19.0063 | 0.8123 | 0.8150 | 0.8131 | 0.8363 | 0.8392 | 0.8372 | 0.8420 | 0.8449 | 0.8429 | 0.9355 | 0.9406 | 0.9374 | 0.9167 | 0.9167 | 0.9167 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0028 | 9.8171 | 190000 | 0.2474 | 18.9842 | 0.8139 | 0.8174 | 0.8151 | 0.8381 | 0.8419 | 0.8394 | 0.8437 | 0.8477 | 0.8451 | 0.9352 | 0.9400 | 0.9369 | 0.9167 | 0.9167 | 0.9167 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0029 | 10.3338 | 200000 | 0.2464 | 18.6003 | 0.8203 | 0.8239 | 0.8216 | 0.8449 | 0.8487 | 0.8462 | 0.8504 | 0.8542 | 0.8517 | 0.9384 | 0.9440 | 0.9405 | 0.9167 | 0.9167 | 0.9167 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0008 | 10.8505 | 210000 | 0.2504 | 18.6098 | 0.8180 | 0.8207 | 0.8188 | 0.8413 | 0.8442 | 0.8422 | 0.8463 | 0.8493 | 0.8472 | 0.9376 | 0.9420 | 0.9391 | 0.9167 | 0.9167 | 0.9167 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0909 | 0.1 | 0.0952 |
| 0.0021 | 11.3672 | 220000 | 0.2585 | 18.8363 | 0.8196 | 0.8222 | 0.8204 | 0.8438 | 0.8467 | 0.8447 | 0.8490 | 0.8520 | 0.8499 | 0.9363 | 0.9412 | 0.9381 | 0.9167 | 0.9167 | 0.9167 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0833 | 0.0769 | 0.08 |
| 0.0036 | 11.8838 | 230000 | 0.2558 | 18.6916 | 0.8156 | 0.8185 | 0.8165 | 0.8397 | 0.8429 | 0.8407 | 0.8455 | 0.8491 | 0.8467 | 0.9369 | 0.9428 | 0.9392 | 0.9167 | 0.9167 | 0.9167 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0007 | 12.4005 | 240000 | 0.2598 | 18.3077 | 0.8177 | 0.8195 | 0.8181 | 0.8416 | 0.8436 | 0.8421 | 0.8469 | 0.8494 | 0.8476 | 0.9403 | 0.9449 | 0.9419 | 0.9167 | 0.9167 | 0.9167 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0009 | 12.9172 | 250000 | 0.2562 | 18.3549 | 0.8226 | 0.8247 | 0.8232 | 0.8464 | 0.8487 | 0.8471 | 0.8520 | 0.8547 | 0.8528 | 0.9408 | 0.9446 | 0.9421 | 0.9167 | 0.9167 | 0.9167 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0005 | 13.4339 | 260000 | 0.2605 | 18.1598 | 0.8248 | 0.8272 | 0.8255 | 0.8488 | 0.8515 | 0.8497 | 0.8538 | 0.8564 | 0.8546 | 0.9405 | 0.9456 | 0.9424 | 0.9167 | 0.9167 | 0.9167 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0769 | 0.0909 | 0.0833 |
| 0.0015 | 13.9506 | 270000 | 0.2670 | 18.4650 | 0.8204 | 0.8244 | 0.8219 | 0.8445 | 0.8488 | 0.8461 | 0.8494 | 0.8542 | 0.8512 | 0.9368 | 0.9441 | 0.9397 | 0.9167 | 0.9167 | 0.9167 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0833 | 0.0833 | 0.0833 |
| 0.0002 | 14.4673 | 280000 | 0.2701 | 19.0094 | 0.8169 | 0.8179 | 0.8168 | 0.8410 | 0.8421 | 0.8409 | 0.8464 | 0.8479 | 0.8466 | 0.9342 | 0.9377 | 0.9353 | 0.9167 | 0.9167 | 0.9167 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0833 | 0.1 | 0.0909 |
| 0.0002 | 14.9840 | 290000 | 0.2699 | 18.2070 | 0.8186 | 0.8239 | 0.8207 | 0.8419 | 0.8475 | 0.8441 | 0.8465 | 0.8523 | 0.8488 | 0.9382 | 0.9456 | 0.9411 | 0.9167 | 0.9167 | 0.9167 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0003 | 15.5007 | 300000 | 0.2696 | 18.4021 | 0.8250 | 0.8248 | 0.8244 | 0.8471 | 0.8470 | 0.8465 | 0.8522 | 0.8528 | 0.8519 | 0.9380 | 0.9412 | 0.9389 | 0.9167 | 0.9167 | 0.9167 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0008 | 16.0174 | 310000 | 0.2645 | 17.6406 | 0.8288 | 0.8314 | 0.8296 | 0.8518 | 0.8546 | 0.8527 | 0.8572 | 0.8602 | 0.8581 | 0.9453 | 0.9498 | 0.9470 | 0.9167 | 0.9231 | 0.9231 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0002 | 16.5340 | 320000 | 0.2712 | 17.8923 | 0.8257 | 0.8290 | 0.8269 | 0.8494 | 0.8529 | 0.8506 | 0.8545 | 0.8581 | 0.8558 | 0.9403 | 0.9458 | 0.9424 | 0.9167 | 0.9231 | 0.9231 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0833 | 0.0833 | 0.0833 |
| 0.0001 | 17.0507 | 330000 | 0.2667 | 17.9710 | 0.8283 | 0.8308 | 0.8291 | 0.8529 | 0.8557 | 0.8538 | 0.8581 | 0.8609 | 0.8590 | 0.9410 | 0.9452 | 0.9425 | 0.9167 | 0.9231 | 0.9231 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0001 | 17.5674 | 340000 | 0.2680 | 17.7130 | 0.8289 | 0.8314 | 0.8296 | 0.8525 | 0.8551 | 0.8532 | 0.8572 | 0.8603 | 0.8582 | 0.9415 | 0.9462 | 0.9432 | 0.9167 | 0.9231 | 0.9231 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0 | 18.0841 | 350000 | 0.2657 | 17.5116 | 0.8328 | 0.8346 | 0.8332 | 0.8559 | 0.8579 | 0.8564 | 0.8607 | 0.8630 | 0.8613 | 0.9429 | 0.9462 | 0.9439 | 0.9231 | 0.9231 | 0.9231 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0008 | 18.6008 | 360000 | 0.2754 | 17.7161 | 0.8339 | 0.8363 | 0.8345 | 0.8568 | 0.8594 | 0.8575 | 0.8620 | 0.8651 | 0.8630 | 0.9425 | 0.9468 | 0.9440 | 0.9167 | 0.9231 | 0.9231 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0833 | 0.0769 | 0.08 |
| 0.0001 | 19.1175 | 370000 | 0.2722 | 17.3007 | 0.8349 | 0.8369 | 0.8354 | 0.8575 | 0.8597 | 0.8581 | 0.8629 | 0.8651 | 0.8635 | 0.9443 | 0.9482 | 0.9456 | 0.9231 | 0.9231 | 0.9231 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.0001 | 19.6342 | 380000 | 0.2751 | 17.2535 | 0.8354 | 0.8383 | 0.8363 | 0.8587 | 0.8618 | 0.8597 | 0.8638 | 0.8670 | 0.8648 | 0.9462 | 0.9506 | 0.9478 | 0.9231 | 0.9231 | 0.9231 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1429 | 0.125 | 0.1333 |
| 0.0 | 20.1509 | 390000 | 0.2743 | 17.3574 | 0.8317 | 0.8351 | 0.8329 | 0.8552 | 0.8586 | 0.8564 | 0.8601 | 0.8638 | 0.8613 | 0.9449 | 0.9500 | 0.9468 | 0.9167 | 0.9231 | 0.9231 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0833 | 0.0769 | 0.08 |
| 0.0 | 20.6676 | 400000 | 0.2794 | 17.5367 | 0.8343 | 0.8347 | 0.8340 | 0.8574 | 0.8579 | 0.8571 | 0.8624 | 0.8632 | 0.8622 | 0.9442 | 0.9465 | 0.9447 | 0.9231 | 0.9231 | 0.9231 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0833 | 0.0833 | 0.0833 |
| 0.0 | 21.1843 | 410000 | 0.2782 | 17.1717 | 0.8365 | 0.8367 | 0.8361 | 0.8591 | 0.8593 | 0.8586 | 0.8640 | 0.8646 | 0.8637 | 0.9457 | 0.9481 | 0.9463 | 0.9231 | 0.9231 | 0.9231 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0833 | 0.0833 | 0.0833 |
| 0.0 | 21.7009 | 420000 | 0.2778 | 17.1308 | 0.8358 | 0.8379 | 0.8363 | 0.8590 | 0.8612 | 0.8596 | 0.8641 | 0.8664 | 0.8647 | 0.9451 | 0.9491 | 0.9465 | 0.9231 | 0.9231 | 0.9231 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0833 | 0.0769 | 0.08 |
| 0.0 | 22.2176 | 430000 | 0.2785 | 17.0553 | 0.8366 | 0.8386 | 0.8371 | 0.8590 | 0.8612 | 0.8596 | 0.8640 | 0.8663 | 0.8646 | 0.9456 | 0.9493 | 0.9469 | 0.9231 | 0.9231 | 0.9231 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0833 | 0.0769 | 0.08 |
| 0.0 | 22.7343 | 440000 | 0.2808 | 17.0679 | 0.8365 | 0.8380 | 0.8368 | 0.8592 | 0.8608 | 0.8595 | 0.8639 | 0.8659 | 0.8644 | 0.9457 | 0.9498 | 0.9472 | 0.9231 | 0.9231 | 0.9231 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0833 | 0.0769 | 0.08 |
| 0.0 | 23.2510 | 450000 | 0.2794 | 16.8885 | 0.8352 | 0.8375 | 0.8359 | 0.8572 | 0.8596 | 0.8579 | 0.8622 | 0.8648 | 0.8630 | 0.9470 | 0.9512 | 0.9485 | 0.9231 | 0.9231 | 0.9231 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0833 | 0.0769 | 0.08 |
| 0.0 | 23.7677 | 460000 | 0.2795 | 16.9514 | 0.8374 | 0.8384 | 0.8374 | 0.8593 | 0.8604 | 0.8593 | 0.8645 | 0.8657 | 0.8646 | 0.9470 | 0.9498 | 0.9478 | 0.9231 | 0.9231 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0833 | 0.0769 | 0.08 |
| 0.0 | 24.2844 | 470000 | 0.2806 | 16.9452 | 0.8385 | 0.8403 | 0.8390 | 0.8609 | 0.8628 | 0.8613 | 0.8657 | 0.8679 | 0.8663 | 0.9468 | 0.9505 | 0.9481 | 0.9231 | 0.9231 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0833 | 0.0769 | 0.08 |
| 0.0 | 24.8011 | 480000 | 0.2804 | 16.7784 | 0.8385 | 0.8403 | 0.8389 | 0.8605 | 0.8624 | 0.8610 | 0.8653 | 0.8675 | 0.8659 | 0.9473 | 0.9510 | 0.9486 | 0.9231 | 0.9231 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0833 | 0.0769 | 0.08 |
| 0.0 | 25.3178 | 490000 | 0.2806 | 16.8098 | 0.8376 | 0.8395 | 0.8381 | 0.8596 | 0.8616 | 0.8601 | 0.8644 | 0.8667 | 0.8651 | 0.9472 | 0.9511 | 0.9486 | 0.9231 | 0.9231 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0833 | 0.0769 | 0.08 |
| 0.0 | 25.8345 | 500000 | 0.2807 | 16.7878 | 0.8378 | 0.8397 | 0.8382 | 0.8599 | 0.8619 | 0.8604 | 0.8648 | 0.8671 | 0.8655 | 0.9472 | 0.9510 | 0.9485 | 0.9231 | 0.9231 | 0.9286 | 1.0 | 1.0 | 1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0833 | 0.0769 | 0.08 |

Framework versions

  • Transformers 4.41.2
  • Pytorch 2.2.1
  • Datasets 2.20.0
  • Tokenizers 0.19.1