ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run999_AugV5_k10_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (not documented in this card). It achieves the following results on the evaluation set:

  • Loss: 0.7558
  • Qwk (Quadratic Weighted Kappa): 0.4912
  • Mse (Mean Squared Error): 0.7558
  • Rmse (Root Mean Squared Error): 0.8694
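
The exact evaluation code is not published with this card, but the reported metrics can be reproduced from predictions and gold scores with scikit-learn. A minimal sketch, assuming the organization scores are integer-valued ratings (as QWK requires discrete labels); note that the reported Loss equals the Mse, which suggests an MSE training objective:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(preds, labels):
    # MSE doubles as the loss when training with an MSE objective,
    # consistent with Loss == Mse in this card.
    mse = mean_squared_error(labels, preds)
    rmse = np.sqrt(mse)
    # Quadratic Weighted Kappa compares discrete ratings, so round
    # the continuous predictions first (an assumption of this sketch).
    qwk = cohen_kappa_score(
        np.rint(preds).astype(int),
        np.asarray(labels).astype(int),
        weights="quadratic",
    )
    return {"mse": mse, "rmse": rmse, "qwk": qwk}
```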

Model description

More information needed

Intended uses & limitations

More information needed
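
No usage details are provided, but the regression-style metrics (Mse/Rmse) suggest the checkpoint carries a single-output sequence-classification head for scoring essay organization. A minimal inference sketch under that assumption:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run999_AugV5_k10_task2_organization"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Placeholder Arabic essay text; replace with the essay to be scored.
inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)
with torch.no_grad():
    # Assumes a single regression logit; a multi-class head would
    # instead require an argmax over the logits.
    score = model(**inputs).logits.squeeze().item()
print(score)
```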

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
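
These settings map directly onto a standard Hugging Face Trainer configuration. A hedged reconstruction (output directory, dataset, and model wiring omitted, since they are not documented here):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="out",                 # assumed; not stated in the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    # Adam settings as reported above (these are also the defaults).
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```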

Training results

The training loss is only logged every 500 steps, hence the "No log" entries in earlier rows.

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0357 2 4.5778 0.0010 4.5778 2.1396
No log 0.0714 4 2.4450 0.0765 2.4450 1.5636
No log 0.1071 6 1.6082 0.0372 1.6082 1.2682
No log 0.1429 8 1.3722 0.0334 1.3722 1.1714
No log 0.1786 10 1.3714 0.1142 1.3714 1.1711
No log 0.2143 12 1.2132 0.1242 1.2132 1.1015
No log 0.25 14 1.3941 0.0639 1.3941 1.1807
No log 0.2857 16 2.0255 -0.0015 2.0255 1.4232
No log 0.3214 18 2.1201 0.0519 2.1201 1.4561
No log 0.3571 20 1.5045 0.1143 1.5045 1.2266
No log 0.3929 22 1.3545 0.0596 1.3545 1.1638
No log 0.4286 24 1.3499 0.1322 1.3499 1.1619
No log 0.4643 26 1.5014 0.0766 1.5014 1.2253
No log 0.5 28 1.5386 0.1568 1.5386 1.2404
No log 0.5357 30 1.3324 0.2044 1.3324 1.1543
No log 0.5714 32 1.0973 0.3107 1.0973 1.0475
No log 0.6071 34 1.0543 0.3750 1.0543 1.0268
No log 0.6429 36 1.0674 0.3397 1.0674 1.0332
No log 0.6786 38 1.2272 0.2010 1.2272 1.1078
No log 0.7143 40 1.2214 0.1921 1.2214 1.1052
No log 0.75 42 0.9740 0.4098 0.9740 0.9869
No log 0.7857 44 0.9355 0.4158 0.9355 0.9672
No log 0.8214 46 0.9552 0.4730 0.9552 0.9773
No log 0.8571 48 1.2773 0.2201 1.2773 1.1302
No log 0.8929 50 1.6688 0.1621 1.6688 1.2918
No log 0.9286 52 1.6931 0.1439 1.6931 1.3012
No log 0.9643 54 1.3719 0.1674 1.3719 1.1713
No log 1.0 56 0.9093 0.5075 0.9093 0.9536
No log 1.0357 58 0.8299 0.5205 0.8299 0.9110
No log 1.0714 60 0.8695 0.5590 0.8695 0.9325
No log 1.1071 62 0.9272 0.4741 0.9272 0.9629
No log 1.1429 64 0.8679 0.4806 0.8679 0.9316
No log 1.1786 66 0.7978 0.4527 0.7978 0.8932
No log 1.2143 68 0.7971 0.4726 0.7971 0.8928
No log 1.25 70 0.8196 0.4599 0.8196 0.9053
No log 1.2857 72 0.8494 0.4534 0.8494 0.9217
No log 1.3214 74 0.9760 0.4578 0.9760 0.9879
No log 1.3571 76 0.9803 0.4713 0.9803 0.9901
No log 1.3929 78 0.9083 0.5190 0.9083 0.9531
No log 1.4286 80 0.8620 0.5756 0.8620 0.9285
No log 1.4643 82 0.9903 0.5328 0.9903 0.9951
No log 1.5 84 0.8667 0.5624 0.8667 0.9310
No log 1.5357 86 0.9548 0.5332 0.9548 0.9772
No log 1.5714 88 1.1931 0.3581 1.1931 1.0923
No log 1.6071 90 1.0565 0.4318 1.0565 1.0278
No log 1.6429 92 0.9490 0.4507 0.9490 0.9741
No log 1.6786 94 0.8892 0.4158 0.8892 0.9430
No log 1.7143 96 0.8714 0.3987 0.8714 0.9335
No log 1.75 98 0.8711 0.4916 0.8711 0.9333
No log 1.7857 100 1.0889 0.4352 1.0889 1.0435
No log 1.8214 102 1.0654 0.4352 1.0654 1.0322
No log 1.8571 104 0.8659 0.5387 0.8659 0.9306
No log 1.8929 106 0.8333 0.4440 0.8333 0.9128
No log 1.9286 108 0.8467 0.3811 0.8467 0.9202
No log 1.9643 110 0.8790 0.4681 0.8790 0.9376
No log 2.0 112 0.9509 0.4666 0.9509 0.9752
No log 2.0357 114 0.9465 0.5145 0.9465 0.9729
No log 2.0714 116 0.9613 0.5345 0.9613 0.9805
No log 2.1071 118 0.9997 0.5363 0.9997 0.9999
No log 2.1429 120 0.9446 0.5063 0.9446 0.9719
No log 2.1786 122 0.9464 0.4826 0.9464 0.9728
No log 2.2143 124 0.9405 0.4417 0.9405 0.9698
No log 2.25 126 0.9079 0.3550 0.9079 0.9529
No log 2.2857 128 0.8921 0.3804 0.8921 0.9445
No log 2.3214 130 0.8893 0.3590 0.8893 0.9430
No log 2.3571 132 0.8665 0.3812 0.8665 0.9309
No log 2.3929 134 0.8244 0.3938 0.8244 0.9080
No log 2.4286 136 0.8412 0.4000 0.8412 0.9172
No log 2.4643 138 0.7699 0.5245 0.7699 0.8775
No log 2.5 140 0.7222 0.5984 0.7222 0.8498
No log 2.5357 142 0.7133 0.6151 0.7133 0.8446
No log 2.5714 144 0.7193 0.6411 0.7193 0.8481
No log 2.6071 146 0.7134 0.5946 0.7134 0.8446
No log 2.6429 148 0.7576 0.5463 0.7576 0.8704
No log 2.6786 150 0.8882 0.5279 0.8882 0.9424
No log 2.7143 152 0.9262 0.4974 0.9262 0.9624
No log 2.75 154 0.8774 0.4849 0.8774 0.9367
No log 2.7857 156 0.7534 0.5948 0.7534 0.8680
No log 2.8214 158 0.7920 0.6453 0.7920 0.8899
No log 2.8571 160 0.7940 0.6555 0.7940 0.8911
No log 2.8929 162 0.7593 0.5779 0.7593 0.8714
No log 2.9286 164 0.8036 0.5069 0.8036 0.8965
No log 2.9643 166 0.9394 0.5203 0.9394 0.9692
No log 3.0 168 0.9902 0.5111 0.9902 0.9951
No log 3.0357 170 0.8866 0.5258 0.8866 0.9416
No log 3.0714 172 0.7977 0.4465 0.7977 0.8932
No log 3.1071 174 0.7616 0.5262 0.7616 0.8727
No log 3.1429 176 0.7523 0.5262 0.7523 0.8673
No log 3.1786 178 0.7690 0.4400 0.7690 0.8769
No log 3.2143 180 0.7606 0.5085 0.7606 0.8721
No log 3.25 182 0.7944 0.4439 0.7944 0.8913
No log 3.2857 184 0.8523 0.4439 0.8523 0.9232
No log 3.3214 186 0.7964 0.4681 0.7964 0.8924
No log 3.3571 188 0.7900 0.4726 0.7900 0.8888
No log 3.3929 190 0.8061 0.4299 0.8061 0.8978
No log 3.4286 192 0.8879 0.4745 0.8879 0.9423
No log 3.4643 194 0.8293 0.4156 0.8293 0.9106
No log 3.5 196 0.7536 0.6049 0.7536 0.8681
No log 3.5357 198 0.7896 0.5899 0.7896 0.8886
No log 3.5714 200 0.7693 0.6409 0.7693 0.8771
No log 3.6071 202 0.7726 0.5342 0.7726 0.8790
No log 3.6429 204 0.8663 0.5203 0.8663 0.9308
No log 3.6786 206 0.9842 0.4627 0.9842 0.9921
No log 3.7143 208 0.9769 0.4627 0.9769 0.9884
No log 3.75 210 0.9286 0.5128 0.9286 0.9637
No log 3.7857 212 0.9094 0.5056 0.9094 0.9536
No log 3.8214 214 0.8821 0.5158 0.8821 0.9392
No log 3.8571 216 0.8609 0.4787 0.8609 0.9278
No log 3.8929 218 0.7918 0.4874 0.7918 0.8898
No log 3.9286 220 0.7363 0.5093 0.7363 0.8581
No log 3.9643 222 0.7219 0.5329 0.7219 0.8497
No log 4.0 224 0.7343 0.4912 0.7343 0.8569
No log 4.0357 226 0.7754 0.4772 0.7754 0.8806
No log 4.0714 228 0.7932 0.4772 0.7932 0.8906
No log 4.1071 230 0.8014 0.4434 0.8014 0.8952
No log 4.1429 232 0.7930 0.4434 0.7930 0.8905
No log 4.1786 234 0.8294 0.4957 0.8294 0.9107
No log 4.2143 236 0.8749 0.5065 0.8749 0.9354
No log 4.25 238 0.8099 0.4912 0.8099 0.8999
No log 4.2857 240 0.7474 0.4491 0.7474 0.8645
No log 4.3214 242 0.7309 0.5479 0.7309 0.8549
No log 4.3571 244 0.7440 0.4671 0.7440 0.8626
No log 4.3929 246 0.7711 0.5340 0.7711 0.8781
No log 4.4286 248 0.7096 0.5125 0.7096 0.8424
No log 4.4643 250 0.6867 0.5690 0.6867 0.8286
No log 4.5 252 0.7144 0.5869 0.7144 0.8452
No log 4.5357 254 0.8450 0.5360 0.8450 0.9192
No log 4.5714 256 0.8446 0.5156 0.8446 0.9190
No log 4.6071 258 0.9036 0.4928 0.9036 0.9506
No log 4.6429 260 0.8128 0.5256 0.8128 0.9016
No log 4.6786 262 0.7319 0.5563 0.7319 0.8555
No log 4.7143 264 0.7280 0.5810 0.7280 0.8532
No log 4.75 266 0.7397 0.5501 0.7397 0.8601
No log 4.7857 268 0.7485 0.5027 0.7485 0.8651
No log 4.8214 270 0.7735 0.4498 0.7735 0.8795
No log 4.8571 272 0.7765 0.4498 0.7765 0.8812
No log 4.8929 274 0.7807 0.4498 0.7807 0.8836
No log 4.9286 276 0.7847 0.4898 0.7847 0.8859
No log 4.9643 278 0.8400 0.4983 0.8400 0.9165
No log 5.0 280 0.8812 0.5313 0.8812 0.9387
No log 5.0357 282 0.8424 0.4785 0.8424 0.9178
No log 5.0714 284 0.8440 0.3974 0.8440 0.9187
No log 5.1071 286 0.8674 0.3960 0.8674 0.9313
No log 5.1429 288 0.8607 0.3960 0.8607 0.9277
No log 5.1786 290 0.8307 0.5308 0.8307 0.9114
No log 5.2143 292 0.8842 0.4940 0.8842 0.9403
No log 5.25 294 0.8957 0.5218 0.8957 0.9464
No log 5.2857 296 0.8417 0.4963 0.8417 0.9174
No log 5.3214 298 0.8304 0.4665 0.8304 0.9113
No log 5.3571 300 0.8410 0.3914 0.8410 0.9170
No log 5.3929 302 0.8308 0.3914 0.8308 0.9115
No log 5.4286 304 0.8157 0.4769 0.8157 0.9032
No log 5.4643 306 0.8228 0.4799 0.8228 0.9071
No log 5.5 308 0.8148 0.4983 0.8148 0.9027
No log 5.5357 310 0.7716 0.5192 0.7716 0.8784
No log 5.5714 312 0.7937 0.4681 0.7937 0.8909
No log 5.6071 314 0.8826 0.4660 0.8826 0.9395
No log 5.6429 316 0.8839 0.4764 0.8839 0.9401
No log 5.6786 318 0.7805 0.4681 0.7805 0.8835
No log 5.7143 320 0.7460 0.5351 0.7460 0.8637
No log 5.75 322 0.7572 0.5727 0.7572 0.8702
No log 5.7857 324 0.7604 0.5374 0.7604 0.8720
No log 5.8214 326 0.7944 0.4324 0.7944 0.8913
No log 5.8571 328 0.8412 0.4608 0.8412 0.9172
No log 5.8929 330 0.8793 0.4854 0.8793 0.9377
No log 5.9286 332 0.8342 0.4706 0.8342 0.9133
No log 5.9643 334 0.8121 0.5188 0.8121 0.9012
No log 6.0 336 0.8611 0.5576 0.8611 0.9280
No log 6.0357 338 0.8515 0.5576 0.8515 0.9228
No log 6.0714 340 0.8621 0.5574 0.8621 0.9285
No log 6.1071 342 0.8385 0.5796 0.8385 0.9157
No log 6.1429 344 0.7743 0.5837 0.7743 0.8800
No log 6.1786 346 0.7262 0.5738 0.7262 0.8521
No log 6.2143 348 0.7421 0.5439 0.7421 0.8614
No log 6.25 350 0.7658 0.5688 0.7658 0.8751
No log 6.2857 352 0.7940 0.5635 0.7940 0.8911
No log 6.3214 354 0.8124 0.5308 0.8124 0.9013
No log 6.3571 356 0.8373 0.4356 0.8373 0.9151
No log 6.3929 358 0.8406 0.4691 0.8406 0.9168
No log 6.4286 360 0.8433 0.4701 0.8433 0.9183
No log 6.4643 362 0.8499 0.3771 0.8499 0.9219
No log 6.5 364 0.8499 0.4663 0.8499 0.9219
No log 6.5357 366 0.8062 0.4724 0.8062 0.8979
No log 6.5714 368 0.8133 0.5684 0.8133 0.9018
No log 6.6071 370 0.8313 0.5712 0.8313 0.9117
No log 6.6429 372 0.7848 0.4898 0.7848 0.8859
No log 6.6786 374 0.7683 0.4780 0.7683 0.8765
No log 6.7143 376 0.7816 0.4512 0.7816 0.8841
No log 6.75 378 0.7797 0.4646 0.7797 0.8830
No log 6.7857 380 0.7831 0.4839 0.7831 0.8850
No log 6.8214 382 0.8271 0.4741 0.8271 0.9095
No log 6.8571 384 0.8742 0.5120 0.8742 0.9350
No log 6.8929 386 0.8088 0.4741 0.8088 0.8994
No log 6.9286 388 0.7804 0.5041 0.7804 0.8834
No log 6.9643 390 0.8043 0.5238 0.8043 0.8968
No log 7.0 392 0.8053 0.5190 0.8053 0.8974
No log 7.0357 394 0.8170 0.5524 0.8170 0.9039
No log 7.0714 396 0.8578 0.5887 0.8578 0.9262
No log 7.1071 398 0.8992 0.5366 0.8992 0.9483
No log 7.1429 400 0.8918 0.5670 0.8918 0.9444
No log 7.1786 402 0.8615 0.5856 0.8615 0.9282
No log 7.2143 404 0.8339 0.4690 0.8339 0.9132
No log 7.25 406 0.8332 0.4889 0.8332 0.9128
No log 7.2857 408 0.8341 0.4705 0.8341 0.9133
No log 7.3214 410 0.8388 0.4316 0.8388 0.9159
No log 7.3571 412 0.8508 0.4637 0.8508 0.9224
No log 7.3929 414 0.8485 0.4349 0.8485 0.9211
No log 7.4286 416 0.8576 0.4640 0.8576 0.9261
No log 7.4643 418 0.8694 0.4400 0.8694 0.9324
No log 7.5 420 0.8671 0.4297 0.8671 0.9312
No log 7.5357 422 0.8685 0.4075 0.8685 0.9320
No log 7.5714 424 0.8740 0.4570 0.8740 0.9349
No log 7.6071 426 0.8731 0.4648 0.8731 0.9344
No log 7.6429 428 0.8830 0.4885 0.8830 0.9397
No log 7.6786 430 0.8885 0.5219 0.8885 0.9426
No log 7.7143 432 0.8874 0.4368 0.8874 0.9420
No log 7.75 434 0.8700 0.3873 0.8700 0.9327
No log 7.7857 436 0.8842 0.3834 0.8842 0.9403
No log 7.8214 438 0.8960 0.4074 0.8960 0.9466
No log 7.8571 440 0.8613 0.4951 0.8613 0.9281
No log 7.8929 442 0.8464 0.4485 0.8464 0.9200
No log 7.9286 444 0.8568 0.5659 0.8568 0.9256
No log 7.9643 446 0.8417 0.5659 0.8417 0.9175
No log 8.0 448 0.8134 0.4994 0.8134 0.9019
No log 8.0357 450 0.8007 0.5365 0.8007 0.8948
No log 8.0714 452 0.8095 0.5406 0.8095 0.8997
No log 8.1071 454 0.7887 0.5220 0.7887 0.8881
No log 8.1429 456 0.7719 0.5621 0.7719 0.8786
No log 8.1786 458 0.7742 0.5581 0.7742 0.8799
No log 8.2143 460 0.7722 0.5581 0.7722 0.8787
No log 8.25 462 0.7726 0.5581 0.7726 0.8790
No log 8.2857 464 0.7708 0.5364 0.7708 0.8779
No log 8.3214 466 0.7740 0.5364 0.7740 0.8797
No log 8.3571 468 0.7727 0.5387 0.7727 0.8791
No log 8.3929 470 0.7720 0.5327 0.7720 0.8787
No log 8.4286 472 0.7598 0.5540 0.7598 0.8717
No log 8.4643 474 0.7565 0.5138 0.7565 0.8698
No log 8.5 476 0.7520 0.5138 0.7520 0.8672
No log 8.5357 478 0.7322 0.5343 0.7322 0.8557
No log 8.5714 480 0.7717 0.6500 0.7717 0.8785
No log 8.6071 482 0.8398 0.5987 0.8398 0.9164
No log 8.6429 484 0.8041 0.6008 0.8041 0.8967
No log 8.6786 486 0.7574 0.5408 0.7574 0.8703
No log 8.7143 488 0.7647 0.5125 0.7647 0.8744
No log 8.75 490 0.7868 0.5618 0.7868 0.8870
No log 8.7857 492 0.7642 0.5497 0.7642 0.8742
No log 8.8214 494 0.7556 0.5911 0.7556 0.8692
No log 8.8571 496 0.7716 0.6472 0.7716 0.8784
No log 8.8929 498 0.7850 0.6314 0.7850 0.8860
0.2444 8.9286 500 0.7844 0.6233 0.7844 0.8857
0.2444 8.9643 502 0.7547 0.5785 0.7547 0.8688
0.2444 9.0 504 0.7530 0.6085 0.7530 0.8677
0.2444 9.0357 506 0.7583 0.5291 0.7583 0.8708
0.2444 9.0714 508 0.7536 0.5266 0.7536 0.8681
0.2444 9.1071 510 0.7558 0.4912 0.7558 0.8694

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1