ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k1_task8_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a usage sketch follows the list):

  • Loss: 0.8028
  • Qwk: 0.3574
  • Mse: 0.8028
  • Rmse: 0.8960
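
The card does not document an inference interface, so the following is a minimal loading sketch, not a confirmed recipe: it assumes the checkpoint carries a single-logit regression head (consistent with the MSE/RMSE metrics above), and the repo id is taken from this card's title.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Repo id as it appears in this card's title.
model_id = "MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k1_task8_organization"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Score one Arabic passage; a regression head yields a single logit.
# The sample text is illustrative only.
inputs = tokenizer("نص عربي للتقييم", return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```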

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
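
These settings map onto transformers.TrainingArguments roughly as sketched below. The output_dir is a hypothetical placeholder, and the model/dataset wiring is omitted because the card does not document it.

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="arabert-task8-organization",  # hypothetical placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Adam settings from the card (these match the Trainer defaults):
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```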

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.4 2 2.7384 -0.0626 2.7384 1.6548
No log 0.8 4 2.2079 0.0432 2.2079 1.4859
No log 1.2 6 0.8382 -0.0234 0.8382 0.9155
No log 1.6 8 0.6967 0.0750 0.6967 0.8347
No log 2.0 10 0.7143 0.2481 0.7143 0.8452
No log 2.4 12 0.6361 0.3562 0.6361 0.7976
No log 2.8 14 0.6220 0.3690 0.6220 0.7887
No log 3.2 16 0.6678 0.1014 0.6678 0.8172
No log 3.6 18 0.7259 0.0236 0.7259 0.8520
No log 4.0 20 0.7120 0.0236 0.7120 0.8438
No log 4.4 22 0.6702 0.0647 0.6702 0.8187
No log 4.8 24 0.6448 0.1950 0.6448 0.8030
No log 5.2 26 0.6157 0.2498 0.6157 0.7847
No log 5.6 28 0.6134 0.0 0.6134 0.7832
No log 6.0 30 0.6034 0.0750 0.6034 0.7768
No log 6.4 32 0.6015 0.1542 0.6015 0.7756
No log 6.8 34 0.6067 0.1430 0.6067 0.7789
No log 7.2 36 0.6278 0.1830 0.6278 0.7923
No log 7.6 38 0.6540 0.1644 0.6540 0.8087
No log 8.0 40 0.6469 0.2318 0.6469 0.8043
No log 8.4 42 0.6158 0.2641 0.6158 0.7848
No log 8.8 44 0.5949 0.4728 0.5949 0.7713
No log 9.2 46 0.5970 0.4402 0.5970 0.7727
No log 9.6 48 0.5837 0.4859 0.5837 0.7640
No log 10.0 50 0.5997 0.4314 0.5997 0.7744
No log 10.4 52 0.5992 0.4449 0.5992 0.7741
No log 10.8 54 0.6380 0.3763 0.6380 0.7988
No log 11.2 56 0.7728 0.2886 0.7728 0.8791
No log 11.6 58 0.8680 0.1886 0.8680 0.9316
No log 12.0 60 0.7542 0.3451 0.7542 0.8685
No log 12.4 62 0.6916 0.3661 0.6916 0.8316
No log 12.8 64 0.7524 0.4350 0.7524 0.8674
No log 13.2 66 0.7225 0.4493 0.7225 0.8500
No log 13.6 68 0.6810 0.4167 0.6810 0.8252
No log 14.0 70 0.6803 0.3840 0.6803 0.8248
No log 14.4 72 0.7010 0.3972 0.7010 0.8372
No log 14.8 74 0.6966 0.3253 0.6966 0.8346
No log 15.2 76 0.6828 0.4132 0.6828 0.8263
No log 15.6 78 0.6851 0.3849 0.6851 0.8277
No log 16.0 80 0.6722 0.3553 0.6722 0.8199
No log 16.4 82 0.6705 0.2480 0.6705 0.8188
No log 16.8 84 0.7143 0.3115 0.7143 0.8452
No log 17.2 86 0.6808 0.2975 0.6808 0.8251
No log 17.6 88 0.6240 0.3704 0.6240 0.7899
No log 18.0 90 0.6234 0.3585 0.6234 0.7895
No log 18.4 92 0.6919 0.3399 0.6919 0.8318
No log 18.8 94 0.9206 0.3779 0.9206 0.9595
No log 19.2 96 1.0173 0.3775 1.0173 1.0086
No log 19.6 98 0.8991 0.4140 0.8991 0.9482
No log 20.0 100 0.7816 0.3817 0.7816 0.8841
No log 20.4 102 0.7226 0.3887 0.7226 0.8501
No log 20.8 104 0.7370 0.4451 0.7370 0.8585
No log 21.2 106 0.6996 0.4052 0.6996 0.8364
No log 21.6 108 0.6547 0.4126 0.6547 0.8091
No log 22.0 110 0.6714 0.4147 0.6714 0.8194
No log 22.4 112 0.7622 0.4328 0.7622 0.8730
No log 22.8 114 0.8344 0.4205 0.8344 0.9135
No log 23.2 116 0.7719 0.4389 0.7719 0.8786
No log 23.6 118 0.6971 0.4402 0.6971 0.8349
No log 24.0 120 0.6877 0.4402 0.6877 0.8293
No log 24.4 122 0.7600 0.4093 0.7600 0.8718
No log 24.8 124 0.8304 0.4432 0.8304 0.9113
No log 25.2 126 0.8100 0.4261 0.8100 0.9000
No log 25.6 128 0.7086 0.4328 0.7086 0.8418
No log 26.0 130 0.6502 0.4375 0.6502 0.8064
No log 26.4 132 0.6388 0.4161 0.6388 0.7992
No log 26.8 134 0.6906 0.4350 0.6906 0.8310
No log 27.2 136 0.7110 0.3968 0.7110 0.8432
No log 27.6 138 0.7100 0.3968 0.7100 0.8426
No log 28.0 140 0.6444 0.3840 0.6444 0.8028
No log 28.4 142 0.6285 0.4161 0.6285 0.7928
No log 28.8 144 0.6776 0.4052 0.6776 0.8232
No log 29.2 146 0.7381 0.4328 0.7381 0.8591
No log 29.6 148 0.7115 0.4451 0.7115 0.8435
No log 30.0 150 0.6635 0.3780 0.6635 0.8146
No log 30.4 152 0.7103 0.4451 0.7103 0.8428
No log 30.8 154 0.7548 0.4493 0.7548 0.8688
No log 31.2 156 0.7584 0.4587 0.7584 0.8709
No log 31.6 158 0.7039 0.4660 0.7039 0.8390
No log 32.0 160 0.6906 0.4253 0.6906 0.8310
No log 32.4 162 0.6817 0.4217 0.6817 0.8256
No log 32.8 164 0.7035 0.4218 0.7035 0.8387
No log 33.2 166 0.7616 0.4261 0.7616 0.8727
No log 33.6 168 0.7671 0.4261 0.7671 0.8758
No log 34.0 170 0.8001 0.4018 0.8001 0.8945
No log 34.4 172 0.8273 0.3344 0.8273 0.9096
No log 34.8 174 0.7988 0.3243 0.7988 0.8938
No log 35.2 176 0.7820 0.3574 0.7820 0.8843
No log 35.6 178 0.7952 0.3574 0.7952 0.8917
No log 36.0 180 0.8471 0.3271 0.8471 0.9204
No log 36.4 182 0.8404 0.3271 0.8404 0.9167
No log 36.8 184 0.7654 0.4051 0.7654 0.8748
No log 37.2 186 0.7326 0.4142 0.7326 0.8559
No log 37.6 188 0.7587 0.4018 0.7587 0.8710
No log 38.0 190 0.7916 0.3753 0.7916 0.8897
No log 38.4 192 0.8318 0.3451 0.8318 0.9120
No log 38.8 194 0.7966 0.3962 0.7966 0.8925
No log 39.2 196 0.8254 0.3506 0.8254 0.9085
No log 39.6 198 0.8255 0.3506 0.8255 0.9086
No log 40.0 200 0.9013 0.3779 0.9013 0.9493
No log 40.4 202 0.9652 0.3807 0.9652 0.9824
No log 40.8 204 0.8917 0.3676 0.8917 0.9443
No log 41.2 206 0.7486 0.4218 0.7486 0.8652
No log 41.6 208 0.6823 0.4253 0.6823 0.8260
No log 42.0 210 0.6557 0.3933 0.6557 0.8097
No log 42.4 212 0.6600 0.3933 0.6600 0.8124
No log 42.8 214 0.6489 0.3599 0.6489 0.8055
No log 43.2 216 0.6617 0.4161 0.6617 0.8134
No log 43.6 218 0.7421 0.2790 0.7421 0.8614
No log 44.0 220 0.7990 0.3202 0.7990 0.8939
No log 44.4 222 0.7890 0.3202 0.7890 0.8882
No log 44.8 224 0.7393 0.2725 0.7393 0.8598
No log 45.2 226 0.6836 0.3836 0.6836 0.8268
No log 45.6 228 0.6716 0.3808 0.6716 0.8195
No log 46.0 230 0.6836 0.4126 0.6836 0.8268
No log 46.4 232 0.7279 0.3713 0.7279 0.8532
No log 46.8 234 0.8244 0.3109 0.8244 0.9079
No log 47.2 236 1.0042 0.3513 1.0042 1.0021
No log 47.6 238 1.1849 0.3308 1.1849 1.0885
No log 48.0 240 1.2319 0.3457 1.2319 1.1099
No log 48.4 242 1.1411 0.3287 1.1411 1.0682
No log 48.8 244 0.9720 0.3456 0.9720 0.9859
No log 49.2 246 0.8012 0.3873 0.8012 0.8951
No log 49.6 248 0.7048 0.3846 0.7048 0.8395
No log 50.0 250 0.6793 0.3425 0.6793 0.8242
No log 50.4 252 0.6950 0.3558 0.6950 0.8337
No log 50.8 254 0.7471 0.3685 0.7471 0.8644
No log 51.2 256 0.8147 0.3594 0.8147 0.9026
No log 51.6 258 0.8774 0.3733 0.8774 0.9367
No log 52.0 260 0.8907 0.3733 0.8907 0.9438
No log 52.4 262 0.8483 0.3440 0.8483 0.9210
No log 52.8 264 0.8447 0.3440 0.8447 0.9191
No log 53.2 266 0.8219 0.3331 0.8219 0.9066
No log 53.6 268 0.7636 0.3594 0.7636 0.8738
No log 54.0 270 0.7006 0.3651 0.7006 0.8370
No log 54.4 272 0.6818 0.3302 0.6818 0.8257
No log 54.8 274 0.6893 0.3207 0.6893 0.8302
No log 55.2 276 0.7084 0.3651 0.7084 0.8417
No log 55.6 278 0.7490 0.3466 0.7490 0.8655
No log 56.0 280 0.7810 0.3466 0.7810 0.8838
No log 56.4 282 0.8200 0.3466 0.8200 0.9055
No log 56.8 284 0.8566 0.3145 0.8566 0.9255
No log 57.2 286 0.8916 0.3733 0.8916 0.9442
No log 57.6 288 0.9139 0.3899 0.9139 0.9560
No log 58.0 290 0.8703 0.3694 0.8703 0.9329
No log 58.4 292 0.7734 0.3663 0.7734 0.8794
No log 58.8 294 0.7058 0.4315 0.7058 0.8401
No log 59.2 296 0.6901 0.4147 0.6901 0.8307
No log 59.6 298 0.7024 0.4147 0.7024 0.8381
No log 60.0 300 0.7199 0.3817 0.7199 0.8485
No log 60.4 302 0.7495 0.4123 0.7495 0.8658
No log 60.8 304 0.7950 0.3938 0.7950 0.8916
No log 61.2 306 0.8456 0.3988 0.8456 0.9196
No log 61.6 308 0.8443 0.3988 0.8443 0.9188
No log 62.0 310 0.7958 0.3783 0.7958 0.8921
No log 62.4 312 0.7362 0.3243 0.7362 0.8580
No log 62.8 314 0.7054 0.3942 0.7054 0.8399
No log 63.2 316 0.6847 0.4044 0.6847 0.8275
No log 63.6 318 0.6981 0.3942 0.6981 0.8355
No log 64.0 320 0.7406 0.3723 0.7406 0.8606
No log 64.4 322 0.8114 0.3938 0.8114 0.9008
No log 64.8 324 0.8484 0.3783 0.8484 0.9211
No log 65.2 326 0.8344 0.3574 0.8344 0.9134
No log 65.6 328 0.7937 0.3938 0.7937 0.8909
No log 66.0 330 0.7645 0.3938 0.7645 0.8744
No log 66.4 332 0.7133 0.4350 0.7133 0.8446
No log 66.8 334 0.6935 0.4044 0.6935 0.8328
No log 67.2 336 0.6986 0.4044 0.6986 0.8358
No log 67.6 338 0.7359 0.4123 0.7359 0.8579
No log 68.0 340 0.7930 0.3938 0.7930 0.8905
No log 68.4 342 0.8265 0.4051 0.8265 0.9091
No log 68.8 344 0.8264 0.4051 0.8264 0.9091
No log 69.2 346 0.7965 0.3849 0.7965 0.8925
No log 69.6 348 0.7457 0.4030 0.7457 0.8636
No log 70.0 350 0.7224 0.3723 0.7224 0.8499
No log 70.4 352 0.7119 0.3817 0.7119 0.8437
No log 70.8 354 0.7300 0.3723 0.7300 0.8544
No log 71.2 356 0.7590 0.3938 0.7590 0.8712
No log 71.6 358 0.7775 0.3938 0.7775 0.8818
No log 72.0 360 0.7868 0.3938 0.7868 0.8870
No log 72.4 362 0.7930 0.3938 0.7930 0.8905
No log 72.8 364 0.7720 0.4030 0.7720 0.8786
No log 73.2 366 0.7403 0.3630 0.7403 0.8604
No log 73.6 368 0.7193 0.4044 0.7193 0.8481
No log 74.0 370 0.7305 0.4044 0.7305 0.8547
No log 74.4 372 0.7758 0.4030 0.7758 0.8808
No log 74.8 374 0.8169 0.4142 0.8169 0.9038
No log 75.2 376 0.8530 0.4051 0.8530 0.9236
No log 75.6 378 0.8541 0.4051 0.8541 0.9242
No log 76.0 380 0.8427 0.4051 0.8427 0.9180
No log 76.4 382 0.7999 0.3938 0.7999 0.8944
No log 76.8 384 0.7668 0.4123 0.7668 0.8757
No log 77.2 386 0.7323 0.3914 0.7323 0.8557
No log 77.6 388 0.7260 0.3914 0.7260 0.8520
No log 78.0 390 0.7257 0.3914 0.7257 0.8519
No log 78.4 392 0.7397 0.3914 0.7397 0.8600
No log 78.8 394 0.7692 0.3817 0.7692 0.8770
No log 79.2 396 0.7892 0.3754 0.7892 0.8884
No log 79.6 398 0.7908 0.3630 0.7908 0.8893
No log 80.0 400 0.7699 0.3723 0.7699 0.8774
No log 80.4 402 0.7360 0.4147 0.7360 0.8579
No log 80.8 404 0.7189 0.4147 0.7189 0.8479
No log 81.2 406 0.7176 0.4147 0.7176 0.8471
No log 81.6 408 0.7279 0.4147 0.7279 0.8532
No log 82.0 410 0.7477 0.3942 0.7477 0.8647
No log 82.4 412 0.7713 0.3333 0.7713 0.8782
No log 82.8 414 0.8043 0.3155 0.8043 0.8968
No log 83.2 416 0.8194 0.3574 0.8194 0.9052
No log 83.6 418 0.8203 0.3754 0.8203 0.9057
No log 84.0 420 0.8043 0.3846 0.8043 0.8968
No log 84.4 422 0.7749 0.3817 0.7749 0.8803
No log 84.8 424 0.7572 0.4147 0.7572 0.8702
No log 85.2 426 0.7565 0.4147 0.7565 0.8697
No log 85.6 428 0.7634 0.4147 0.7634 0.8737
No log 86.0 430 0.7759 0.3846 0.7759 0.8808
No log 86.4 432 0.7887 0.3938 0.7887 0.8881
No log 86.8 434 0.7947 0.3938 0.7947 0.8914
No log 87.2 436 0.7870 0.3846 0.7870 0.8871
No log 87.6 438 0.7763 0.4067 0.7763 0.8811
No log 88.0 440 0.7723 0.3637 0.7723 0.8788
No log 88.4 442 0.7713 0.3539 0.7713 0.8782
No log 88.8 444 0.7807 0.3539 0.7807 0.8836
No log 89.2 446 0.8004 0.3873 0.8004 0.8947
No log 89.6 448 0.8179 0.3783 0.8179 0.9044
No log 90.0 450 0.8214 0.3783 0.8214 0.9063
No log 90.4 452 0.8124 0.3783 0.8124 0.9013
No log 90.8 454 0.8050 0.3873 0.8050 0.8972
No log 91.2 456 0.7909 0.3770 0.7909 0.8893
No log 91.6 458 0.7760 0.3539 0.7760 0.8809
No log 92.0 460 0.7671 0.3539 0.7671 0.8758
No log 92.4 462 0.7666 0.3637 0.7666 0.8755
No log 92.8 464 0.7672 0.3637 0.7672 0.8759
No log 93.2 466 0.7713 0.3539 0.7713 0.8782
No log 93.6 468 0.7803 0.3443 0.7803 0.8833
No log 94.0 470 0.7884 0.3243 0.7884 0.8879
No log 94.4 472 0.7929 0.3574 0.7929 0.8904
No log 94.8 474 0.7924 0.3574 0.7924 0.8902
No log 95.2 476 0.7891 0.3574 0.7891 0.8883
No log 95.6 478 0.7872 0.3663 0.7872 0.8872
No log 96.0 480 0.7873 0.3663 0.7873 0.8873
No log 96.4 482 0.7894 0.3754 0.7894 0.8885
No log 96.8 484 0.7885 0.3754 0.7885 0.8880
No log 97.2 486 0.7905 0.3754 0.7905 0.8891
No log 97.6 488 0.7942 0.3754 0.7942 0.8912
No log 98.0 490 0.7975 0.3663 0.7975 0.8930
No log 98.4 492 0.7996 0.3663 0.7996 0.8942
No log 98.8 494 0.8012 0.3574 0.8012 0.8951
No log 99.2 496 0.8026 0.3574 0.8026 0.8959
No log 99.6 498 0.8027 0.3574 0.8027 0.8959
0.2205 100.0 500 0.8028 0.3574 0.8028 0.8960
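
Qwk in the table above is presumably quadratic weighted Cohen's kappa, and Rmse the square root of Mse. A minimal sketch of these metrics using scikit-learn follows; rounding continuous predictions to integers for kappa is an assumption, since the card does not describe the label space.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def eval_metrics(y_true, y_pred):
    """Return (qwk, mse, rmse) in the same sense as the table columns."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mse = mean_squared_error(y_true, y_pred)
    rmse = np.sqrt(mse)
    # Kappa needs discrete labels; rounding is an assumption here.
    qwk = cohen_kappa_score(
        np.rint(y_true).astype(int),
        np.rint(y_pred).astype(int),
        weights="quadratic",
    )
    return qwk, mse, rmse
```

For example, eval_metrics([2, 3, 1], [2.2, 2.8, 1.1]) yields values on the same scales as the Qwk, Mse, and Rmse columns above.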

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1