ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run999_AugV5_k5_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the auto-generated card records it as "None"). It achieves the following results on the evaluation set, corresponding to the final row of the training-results table below:

  • Loss: 0.6460
  • Qwk (Quadratic Weighted Kappa): 0.5185
  • Mse (Mean Squared Error): 0.6460
  • Rmse (Root Mean Squared Error): 0.8038
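
These metrics are consistent with a regression-style scoring head (Loss and Mse are identical). As a minimal sketch only, assuming scikit-learn and rounding continuous predictions to integer scores (this is not the author's evaluation code), Qwk, Mse, and Rmse can be computed as follows:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(y_true, y_pred):
    # Quadratic Weighted Kappa compares integer labels, so continuous
    # regression outputs are rounded before scoring (an assumption here).
    qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
    mse = mean_squared_error(y_true, y_pred)
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```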

Model description

More information needed

Intended uses & limitations

More information needed
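
Since no usage guidance is given, the following is only a minimal loading sketch. It assumes the checkpoint carries a single-logit sequence-classification (regression) head, which would match the Mse/Rmse metrics above but is not confirmed by this card:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run999_AugV5_k5_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Score one Arabic essay (placeholder text below).
inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.squeeze().tolist())
```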

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
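
As a reproducibility aid, the hyperparameters above map directly onto transformers TrainingArguments. A sketch follows; output_dir and anything not listed above are illustrative assumptions, not taken from the original run:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_run999_task5_organization",  # hypothetical path
    learning_rate=2e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,        # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```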

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0714 2 3.9875 -0.0133 3.9875 1.9969
No log 0.1429 4 2.0803 0.0633 2.0803 1.4423
No log 0.2143 6 1.2842 0.0553 1.2842 1.1332
No log 0.2857 8 1.2169 0.0941 1.2169 1.1032
No log 0.3571 10 1.0315 0.1783 1.0315 1.0157
No log 0.4286 12 1.0349 0.1545 1.0349 1.0173
No log 0.5 14 1.0701 0.2135 1.0701 1.0344
No log 0.5714 16 1.0900 0.2301 1.0900 1.0441
No log 0.6429 18 1.0659 0.2795 1.0659 1.0324
No log 0.7143 20 1.0526 0.2497 1.0526 1.0260
No log 0.7857 22 0.9694 0.2643 0.9694 0.9846
No log 0.8571 24 0.9540 0.4450 0.9540 0.9767
No log 0.9286 26 0.9815 0.4091 0.9815 0.9907
No log 1.0 28 0.9082 0.4681 0.9082 0.9530
No log 1.0714 30 0.8171 0.4398 0.8171 0.9040
No log 1.1429 32 0.8113 0.4286 0.8113 0.9007
No log 1.2143 34 0.8288 0.5186 0.8288 0.9104
No log 1.2857 36 0.8338 0.5498 0.8338 0.9131
No log 1.3571 38 0.8859 0.4915 0.8859 0.9412
No log 1.4286 40 0.8174 0.5912 0.8174 0.9041
No log 1.5 42 0.7695 0.5740 0.7695 0.8772
No log 1.5714 44 0.7634 0.4374 0.7634 0.8737
No log 1.6429 46 1.0229 0.4391 1.0229 1.0114
No log 1.7143 48 0.8788 0.4439 0.8788 0.9375
No log 1.7857 50 0.6918 0.4691 0.6918 0.8317
No log 1.8571 52 0.6978 0.5343 0.6978 0.8353
No log 1.9286 54 0.7115 0.5342 0.7115 0.8435
No log 2.0 56 0.8274 0.5117 0.8274 0.9096
No log 2.0714 58 1.0183 0.4988 1.0183 1.0091
No log 2.1429 60 0.9499 0.4844 0.9499 0.9746
No log 2.2143 62 0.7156 0.5671 0.7156 0.8459
No log 2.2857 64 0.7637 0.5746 0.7637 0.8739
No log 2.3571 66 0.7458 0.5565 0.7458 0.8636
No log 2.4286 68 0.7023 0.5203 0.7023 0.8380
No log 2.5 70 0.6761 0.5820 0.6761 0.8222
No log 2.5714 72 0.6996 0.4953 0.6996 0.8364
No log 2.6429 74 0.8280 0.5782 0.8280 0.9099
No log 2.7143 76 0.9543 0.5069 0.9543 0.9769
No log 2.7857 78 0.8144 0.5471 0.8144 0.9024
No log 2.8571 80 0.6766 0.5427 0.6766 0.8225
No log 2.9286 82 0.7072 0.5681 0.7072 0.8409
No log 3.0 84 0.7249 0.6008 0.7249 0.8514
No log 3.0714 86 0.6938 0.5702 0.6938 0.8329
No log 3.1429 88 0.7444 0.5582 0.7444 0.8628
No log 3.2143 90 0.7456 0.5849 0.7456 0.8635
No log 3.2857 92 0.6523 0.5635 0.6523 0.8076
No log 3.3571 94 0.6519 0.5383 0.6519 0.8074
No log 3.4286 96 0.7027 0.6148 0.7027 0.8383
No log 3.5 98 0.7927 0.6097 0.7927 0.8903
No log 3.5714 100 0.7637 0.6279 0.7637 0.8739
No log 3.6429 102 0.7228 0.6411 0.7228 0.8502
No log 3.7143 104 0.7138 0.6696 0.7138 0.8449
No log 3.7857 106 0.6906 0.6177 0.6906 0.8310
No log 3.8571 108 0.8414 0.5625 0.8414 0.9173
No log 3.9286 110 0.9232 0.5638 0.9232 0.9609
No log 4.0 112 0.8830 0.5458 0.8830 0.9397
No log 4.0714 114 0.7362 0.5323 0.7362 0.8580
No log 4.1429 116 0.6513 0.6491 0.6513 0.8071
No log 4.2143 118 0.6681 0.5926 0.6681 0.8174
No log 4.2857 120 0.8542 0.5653 0.8542 0.9242
No log 4.3571 122 1.0059 0.5183 1.0059 1.0029
No log 4.4286 124 0.8413 0.5653 0.8413 0.9172
No log 4.5 126 0.6631 0.6115 0.6631 0.8143
No log 4.5714 128 0.6766 0.6214 0.6766 0.8225
No log 4.6429 130 0.7196 0.5933 0.7196 0.8483
No log 4.7143 132 0.7347 0.6168 0.7347 0.8571
No log 4.7857 134 0.6883 0.5937 0.6883 0.8296
No log 4.8571 136 0.6973 0.6178 0.6973 0.8350
No log 4.9286 138 0.6900 0.5563 0.6900 0.8307
No log 5.0 140 0.6947 0.5759 0.6947 0.8335
No log 5.0714 142 0.7962 0.4460 0.7962 0.8923
No log 5.1429 144 0.8821 0.5384 0.8821 0.9392
No log 5.2143 146 0.7535 0.5424 0.7535 0.8681
No log 5.2857 148 0.6842 0.5827 0.6842 0.8272
No log 5.3571 150 0.6773 0.5614 0.6773 0.8230
No log 5.4286 152 0.6645 0.5711 0.6645 0.8151
No log 5.5 154 0.6592 0.5402 0.6592 0.8119
No log 5.5714 156 0.7093 0.5103 0.7093 0.8422
No log 5.6429 158 0.6975 0.5571 0.6975 0.8352
No log 5.7143 160 0.7017 0.5981 0.7017 0.8377
No log 5.7857 162 0.6673 0.6362 0.6673 0.8169
No log 5.8571 164 0.5903 0.6144 0.5903 0.7683
No log 5.9286 166 0.5936 0.6330 0.5936 0.7704
No log 6.0 168 0.6154 0.6133 0.6154 0.7845
No log 6.0714 170 0.6346 0.5660 0.6346 0.7966
No log 6.1429 172 0.6441 0.5563 0.6441 0.8025
No log 6.2143 174 0.6482 0.5809 0.6482 0.8051
No log 6.2857 176 0.6561 0.4692 0.6561 0.8100
No log 6.3571 178 0.6449 0.4810 0.6449 0.8030
No log 6.4286 180 0.6512 0.6143 0.6512 0.8070
No log 6.5 182 0.6874 0.5699 0.6874 0.8291
No log 6.5714 184 0.6550 0.5919 0.6550 0.8093
No log 6.6429 186 0.6346 0.6039 0.6346 0.7966
No log 6.7143 188 0.6433 0.5977 0.6433 0.8021
No log 6.7857 190 0.6484 0.6144 0.6484 0.8053
No log 6.8571 192 0.6454 0.6772 0.6454 0.8034
No log 6.9286 194 0.6392 0.6151 0.6392 0.7995
No log 7.0 196 0.6582 0.5777 0.6582 0.8113
No log 7.0714 198 0.6354 0.5978 0.6354 0.7971
No log 7.1429 200 0.6094 0.5759 0.6094 0.7807
No log 7.2143 202 0.8118 0.5888 0.8118 0.9010
No log 7.2857 204 0.9799 0.5273 0.9799 0.9899
No log 7.3571 206 0.8806 0.5436 0.8806 0.9384
No log 7.4286 208 0.6506 0.6893 0.6506 0.8066
No log 7.5 210 0.6131 0.6623 0.6131 0.7830
No log 7.5714 212 0.7722 0.5917 0.7722 0.8787
No log 7.6429 214 0.8697 0.5694 0.8697 0.9326
No log 7.7143 216 0.7976 0.5531 0.7976 0.8931
No log 7.7857 218 0.6616 0.5862 0.6616 0.8134
No log 7.8571 220 0.6109 0.5724 0.6109 0.7816
No log 7.9286 222 0.6616 0.5653 0.6616 0.8134
No log 8.0 224 0.6917 0.4606 0.6917 0.8317
No log 8.0714 226 0.6644 0.4554 0.6644 0.8151
No log 8.1429 228 0.6453 0.5408 0.6453 0.8033
No log 8.2143 230 0.6797 0.5883 0.6797 0.8245
No log 8.2857 232 0.7134 0.5033 0.7134 0.8446
No log 8.3571 234 0.6804 0.6255 0.6804 0.8249
No log 8.4286 236 0.6367 0.5889 0.6367 0.7980
No log 8.5 238 0.6196 0.5165 0.6196 0.7871
No log 8.5714 240 0.6023 0.4929 0.6023 0.7760
No log 8.6429 242 0.5942 0.5971 0.5942 0.7709
No log 8.7143 244 0.5893 0.6325 0.5893 0.7677
No log 8.7857 246 0.6094 0.6790 0.6094 0.7806
No log 8.8571 248 0.6331 0.6790 0.6331 0.7957
No log 8.9286 250 0.6323 0.6380 0.6323 0.7952
No log 9.0 252 0.5851 0.6415 0.5851 0.7649
No log 9.0714 254 0.5639 0.6307 0.5639 0.7509
No log 9.1429 256 0.5677 0.6096 0.5677 0.7534
No log 9.2143 258 0.5717 0.6025 0.5717 0.7561
No log 9.2857 260 0.5810 0.6219 0.5810 0.7622
No log 9.3571 262 0.6217 0.6249 0.6217 0.7885
No log 9.4286 264 0.6201 0.6063 0.6201 0.7875
No log 9.5 266 0.5913 0.6241 0.5913 0.7690
No log 9.5714 268 0.5807 0.5835 0.5807 0.7620
No log 9.6429 270 0.5979 0.5663 0.5979 0.7733
No log 9.7143 272 0.6131 0.5459 0.6131 0.7830
No log 9.7857 274 0.6148 0.4958 0.6148 0.7841
No log 9.8571 276 0.5989 0.5343 0.5989 0.7739
No log 9.9286 278 0.5850 0.6330 0.5850 0.7649
No log 10.0 280 0.5939 0.6406 0.5939 0.7707
No log 10.0714 282 0.6006 0.6578 0.6006 0.7750
No log 10.1429 284 0.5918 0.6578 0.5918 0.7693
No log 10.2143 286 0.5907 0.6614 0.5907 0.7685
No log 10.2857 288 0.5946 0.6439 0.5946 0.7711
No log 10.3571 290 0.6335 0.6144 0.6335 0.7959
No log 10.4286 292 0.6600 0.6080 0.6600 0.8124
No log 10.5 294 0.6577 0.5905 0.6577 0.8110
No log 10.5714 296 0.6850 0.5684 0.6850 0.8276
No log 10.6429 298 0.6647 0.5432 0.6647 0.8153
No log 10.7143 300 0.6347 0.5405 0.6347 0.7967
No log 10.7857 302 0.6175 0.6035 0.6175 0.7858
No log 10.8571 304 0.6154 0.6320 0.6154 0.7845
No log 10.9286 306 0.6497 0.6246 0.6497 0.8060
No log 11.0 308 0.7198 0.5782 0.7198 0.8484
No log 11.0714 310 0.7015 0.5607 0.7015 0.8376
No log 11.1429 312 0.6899 0.5273 0.6899 0.8306
No log 11.2143 314 0.7523 0.4473 0.7523 0.8673
No log 11.2857 316 0.7913 0.4336 0.7913 0.8896
No log 11.3571 318 0.8204 0.4060 0.8204 0.9058
No log 11.4286 320 0.7996 0.4336 0.7996 0.8942
No log 11.5 322 0.7700 0.4867 0.7700 0.8775
No log 11.5714 324 0.7587 0.4995 0.7587 0.8710
No log 11.6429 326 0.7475 0.5442 0.7475 0.8646
No log 11.7143 328 0.7082 0.5346 0.7082 0.8416
No log 11.7857 330 0.6953 0.5138 0.6953 0.8338
No log 11.8571 332 0.6932 0.4625 0.6932 0.8326
No log 11.9286 334 0.6864 0.5127 0.6864 0.8285
No log 12.0 336 0.6946 0.5645 0.6946 0.8334
No log 12.0714 338 0.6878 0.5540 0.6878 0.8293
No log 12.1429 340 0.6728 0.5359 0.6728 0.8202
No log 12.2143 342 0.6561 0.4659 0.6561 0.8100
No log 12.2857 344 0.6509 0.4938 0.6509 0.8068
No log 12.3571 346 0.6423 0.4938 0.6423 0.8015
No log 12.4286 348 0.6329 0.5415 0.6329 0.7955
No log 12.5 350 0.6337 0.5475 0.6337 0.7961
No log 12.5714 352 0.6320 0.5796 0.6320 0.7950
No log 12.6429 354 0.6446 0.6595 0.6446 0.8029
No log 12.7143 356 0.7041 0.6388 0.7041 0.8391
No log 12.7857 358 0.7158 0.5812 0.7158 0.8460
No log 12.8571 360 0.6686 0.6541 0.6686 0.8177
No log 12.9286 362 0.6232 0.5884 0.6232 0.7895
No log 13.0 364 0.6211 0.5682 0.6211 0.7881
No log 13.0714 366 0.6474 0.6386 0.6474 0.8046
No log 13.1429 368 0.7090 0.6013 0.7090 0.8420
No log 13.2143 370 0.7121 0.5823 0.7121 0.8438
No log 13.2857 372 0.6569 0.6275 0.6569 0.8105
No log 13.3571 374 0.6147 0.5833 0.6147 0.7840
No log 13.4286 376 0.6325 0.6578 0.6325 0.7953
No log 13.5 378 0.6390 0.6539 0.6390 0.7994
No log 13.5714 380 0.6246 0.5960 0.6246 0.7903
No log 13.6429 382 0.6336 0.4807 0.6336 0.7960
No log 13.7143 384 0.6523 0.5313 0.6523 0.8077
No log 13.7857 386 0.6483 0.6080 0.6483 0.8052
No log 13.8571 388 0.6294 0.6302 0.6294 0.7933
No log 13.9286 390 0.6204 0.5731 0.6204 0.7877
No log 14.0 392 0.6394 0.6404 0.6394 0.7996
No log 14.0714 394 0.6283 0.5855 0.6283 0.7927
No log 14.1429 396 0.6185 0.5902 0.6185 0.7865
No log 14.2143 398 0.6470 0.4692 0.6470 0.8044
No log 14.2857 400 0.6739 0.4590 0.6739 0.8209
No log 14.3571 402 0.6562 0.4723 0.6562 0.8101
No log 14.4286 404 0.6275 0.5530 0.6275 0.7922
No log 14.5 406 0.6292 0.5960 0.6292 0.7932
No log 14.5714 408 0.6657 0.6361 0.6657 0.8159
No log 14.6429 410 0.7186 0.5610 0.7186 0.8477
No log 14.7143 412 0.7105 0.5559 0.7105 0.8429
No log 14.7857 414 0.6958 0.5380 0.6958 0.8341
No log 14.8571 416 0.6699 0.5412 0.6699 0.8185
No log 14.9286 418 0.6375 0.5168 0.6375 0.7984
No log 15.0 420 0.6177 0.6383 0.6177 0.7859
No log 15.0714 422 0.6386 0.5737 0.6386 0.7991
No log 15.1429 424 0.6429 0.6019 0.6429 0.8018
No log 15.2143 426 0.6237 0.6410 0.6237 0.7898
No log 15.2857 428 0.6201 0.6225 0.6201 0.7874
No log 15.3571 430 0.6372 0.5759 0.6372 0.7982
No log 15.4286 432 0.6584 0.5089 0.6584 0.8114
No log 15.5 434 0.6488 0.5063 0.6488 0.8055
No log 15.5714 436 0.6485 0.4929 0.6485 0.8053
No log 15.6429 438 0.6530 0.5678 0.6530 0.8081
No log 15.7143 440 0.6680 0.5317 0.6680 0.8173
No log 15.7857 442 0.6514 0.5678 0.6514 0.8071
No log 15.8571 444 0.6481 0.5421 0.6481 0.8051
No log 15.9286 446 0.6895 0.5560 0.6895 0.8304
No log 16.0 448 0.7626 0.5570 0.7626 0.8733
No log 16.0714 450 0.7795 0.5570 0.7795 0.8829
No log 16.1429 452 0.7267 0.5570 0.7267 0.8525
No log 16.2143 454 0.6398 0.5955 0.6398 0.7999
No log 16.2857 456 0.6023 0.5859 0.6023 0.7761
No log 16.3571 458 0.6180 0.5874 0.6180 0.7862
No log 16.4286 460 0.6175 0.5288 0.6175 0.7858
No log 16.5 462 0.6175 0.5168 0.6175 0.7858
No log 16.5714 464 0.6178 0.5432 0.6178 0.7860
No log 16.6429 466 0.6236 0.5809 0.6236 0.7897
No log 16.7143 468 0.6169 0.5809 0.6169 0.7854
No log 16.7857 470 0.6055 0.5432 0.6055 0.7782
No log 16.8571 472 0.6241 0.6054 0.6241 0.7900
No log 16.9286 474 0.6330 0.6405 0.6330 0.7956
No log 17.0 476 0.6227 0.6054 0.6227 0.7891
No log 17.0714 478 0.6107 0.6084 0.6107 0.7814
No log 17.1429 480 0.6068 0.5536 0.6068 0.7790
No log 17.2143 482 0.6294 0.5202 0.6294 0.7933
No log 17.2857 484 0.6972 0.5353 0.6972 0.8350
No log 17.3571 486 0.7292 0.5688 0.7292 0.8539
No log 17.4286 488 0.7063 0.5898 0.7063 0.8404
No log 17.5 490 0.6235 0.5434 0.6235 0.7896
No log 17.5714 492 0.5984 0.5635 0.5984 0.7736
No log 17.6429 494 0.6246 0.6084 0.6246 0.7903
No log 17.7143 496 0.6643 0.5883 0.6643 0.8150
No log 17.7857 498 0.6746 0.5103 0.6746 0.8213
0.1911 17.8571 500 0.6394 0.6117 0.6394 0.7996
0.1911 17.9286 502 0.6101 0.6001 0.6101 0.7811
0.1911 18.0 504 0.6109 0.5548 0.6109 0.7816
0.1911 18.0714 506 0.6671 0.5586 0.6671 0.8168
0.1911 18.1429 508 0.6746 0.5898 0.6746 0.8214
0.1911 18.2143 510 0.6170 0.6288 0.6170 0.7855
0.1911 18.2857 512 0.5867 0.5961 0.5867 0.7660
0.1911 18.3571 514 0.5863 0.6219 0.5863 0.7657
0.1911 18.4286 516 0.5996 0.6117 0.5996 0.7743
0.1911 18.5 518 0.6148 0.6117 0.6148 0.7841
0.1911 18.5714 520 0.6220 0.5902 0.6220 0.7887
0.1911 18.6429 522 0.6389 0.5549 0.6389 0.7993
0.1911 18.7143 524 0.6460 0.5185 0.6460 0.8038

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1