ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run999_AugV5_k20_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0239
  • Qwk (quadratic weighted kappa): 0.1951
  • Mse: 1.0239
  • Rmse: 1.0119
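
The MSE/RMSE/QWK metrics suggest the model scores essay organization on a numeric scale. Below is a minimal inference sketch, assuming a single-output regression head (the card does not state the head type or the score range); the repository id is taken from the model name above:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = ("MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_"
            "FineTuningAraBERT_run999_AugV5_k20_task5_organization")

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Hypothetical input: any Arabic essay whose organization should be scored.
essay = "نص المقال هنا"
inputs = tokenizer(essay, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Assuming a single-output regression head, the raw logit is the predicted score.
print(logits.squeeze().item())
```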

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
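
These settings map onto `TrainingArguments` roughly as follows. This is a sketch, not the original training script: the dataset objects and `compute_metrics` are hypothetical placeholders, since the training data is undocumented, and the Adam betas/epsilon above are the Trainer defaults.

```python
from transformers import (AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

model = AutoModelForSequenceClassification.from_pretrained(
    "aubmindlab/bert-base-arabertv02",
    num_labels=1,  # assumption: single-output regression head
)

args = TrainingArguments(
    output_dir="arabert_task5_organization",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",  # the results table logs an eval every 2 steps
    eval_steps=2,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,      # hypothetical: not documented on this card
    eval_dataset=eval_dataset,        # hypothetical: not documented on this card
    compute_metrics=compute_metrics,  # see the sketch after the results table
)
trainer.train()
```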

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0196 | 2 | 4.2821 | 0.0079 | 4.2821 | 2.0693 |
| No log | 0.0392 | 4 | 2.4968 | 0.0543 | 2.4968 | 1.5801 |
| No log | 0.0588 | 6 | 1.7054 | -0.0018 | 1.7054 | 1.3059 |
| No log | 0.0784 | 8 | 1.1373 | 0.2515 | 1.1373 | 1.0664 |
| No log | 0.0980 | 10 | 1.2523 | 0.0011 | 1.2523 | 1.1191 |
| No log | 0.1176 | 12 | 1.3792 | -0.0428 | 1.3792 | 1.1744 |
| No log | 0.1373 | 14 | 1.3811 | -0.0548 | 1.3811 | 1.1752 |
| No log | 0.1569 | 16 | 1.5603 | -0.0743 | 1.5603 | 1.2491 |
| No log | 0.1765 | 18 | 1.4793 | -0.0743 | 1.4793 | 1.2163 |
| No log | 0.1961 | 20 | 1.2598 | -0.0311 | 1.2598 | 1.1224 |
| No log | 0.2157 | 22 | 1.1584 | 0.1881 | 1.1584 | 1.0763 |
| No log | 0.2353 | 24 | 1.1273 | 0.1826 | 1.1273 | 1.0617 |
| No log | 0.2549 | 26 | 1.0525 | 0.2467 | 1.0525 | 1.0259 |
| No log | 0.2745 | 28 | 1.0446 | 0.2023 | 1.0446 | 1.0220 |
| No log | 0.2941 | 30 | 1.1492 | 0.1324 | 1.1492 | 1.0720 |
| No log | 0.3137 | 32 | 1.0030 | 0.2416 | 1.0030 | 1.0015 |
| No log | 0.3333 | 34 | 0.9832 | 0.3549 | 0.9832 | 0.9916 |
| No log | 0.3529 | 36 | 0.9614 | 0.4002 | 0.9614 | 0.9805 |
| No log | 0.3725 | 38 | 0.9009 | 0.3448 | 0.9009 | 0.9492 |
| No log | 0.3922 | 40 | 1.2227 | 0.2395 | 1.2227 | 1.1058 |
| No log | 0.4118 | 42 | 1.2504 | 0.2038 | 1.2504 | 1.1182 |
| No log | 0.4314 | 44 | 0.9177 | 0.3933 | 0.9177 | 0.9580 |
| No log | 0.4510 | 46 | 1.3788 | 0.3543 | 1.3788 | 1.1742 |
| No log | 0.4706 | 48 | 1.8917 | 0.1889 | 1.8917 | 1.3754 |
| No log | 0.4902 | 50 | 1.6280 | 0.2921 | 1.6280 | 1.2759 |
| No log | 0.5098 | 52 | 1.1316 | 0.2991 | 1.1316 | 1.0638 |
| No log | 0.5294 | 54 | 0.9586 | 0.4522 | 0.9586 | 0.9791 |
| No log | 0.5490 | 56 | 0.9551 | 0.2549 | 0.9551 | 0.9773 |
| No log | 0.5686 | 58 | 0.9365 | 0.3278 | 0.9365 | 0.9677 |
| No log | 0.5882 | 60 | 0.9504 | 0.4136 | 0.9504 | 0.9749 |
| No log | 0.6078 | 62 | 1.2522 | 0.2457 | 1.2522 | 1.1190 |
| No log | 0.6275 | 64 | 1.4552 | 0.2103 | 1.4552 | 1.2063 |
| No log | 0.6471 | 66 | 1.2670 | 0.3616 | 1.2670 | 1.1256 |
| No log | 0.6667 | 68 | 1.1042 | 0.3405 | 1.1042 | 1.0508 |
| No log | 0.6863 | 70 | 1.0147 | 0.2826 | 1.0147 | 1.0073 |
| No log | 0.7059 | 72 | 0.9395 | 0.3879 | 0.9395 | 0.9693 |
| No log | 0.7255 | 74 | 0.8810 | 0.4251 | 0.8810 | 0.9386 |
| No log | 0.7451 | 76 | 0.8574 | 0.4277 | 0.8574 | 0.9260 |
| No log | 0.7647 | 78 | 0.8716 | 0.4078 | 0.8716 | 0.9336 |
| No log | 0.7843 | 80 | 0.9465 | 0.3415 | 0.9465 | 0.9729 |
| No log | 0.8039 | 82 | 1.1232 | 0.3101 | 1.1232 | 1.0598 |
| No log | 0.8235 | 84 | 1.0863 | 0.3220 | 1.0863 | 1.0423 |
| No log | 0.8431 | 86 | 0.8406 | 0.3577 | 0.8406 | 0.9169 |
| No log | 0.8627 | 88 | 0.8113 | 0.4409 | 0.8113 | 0.9007 |
| No log | 0.8824 | 90 | 0.9661 | 0.3229 | 0.9661 | 0.9829 |
| No log | 0.9020 | 92 | 0.9064 | 0.3396 | 0.9064 | 0.9521 |
| No log | 0.9216 | 94 | 0.7896 | 0.4642 | 0.7896 | 0.8886 |
| No log | 0.9412 | 96 | 1.0308 | 0.3185 | 1.0308 | 1.0153 |
| No log | 0.9608 | 98 | 1.3470 | 0.3022 | 1.3470 | 1.1606 |
| No log | 0.9804 | 100 | 1.3495 | 0.2958 | 1.3495 | 1.1617 |
| No log | 1.0 | 102 | 1.1178 | 0.3295 | 1.1178 | 1.0573 |
| No log | 1.0196 | 104 | 0.8352 | 0.5610 | 0.8352 | 0.9139 |
| No log | 1.0392 | 106 | 0.7427 | 0.4391 | 0.7427 | 0.8618 |
| No log | 1.0588 | 108 | 0.7737 | 0.5082 | 0.7737 | 0.8796 |
| No log | 1.0784 | 110 | 0.7372 | 0.4658 | 0.7372 | 0.8586 |
| No log | 1.0980 | 112 | 0.7295 | 0.5117 | 0.7295 | 0.8541 |
| No log | 1.1176 | 114 | 0.9234 | 0.4987 | 0.9234 | 0.9609 |
| No log | 1.1373 | 116 | 1.1008 | 0.4140 | 1.1008 | 1.0492 |
| No log | 1.1569 | 118 | 1.0237 | 0.3624 | 1.0237 | 1.0118 |
| No log | 1.1765 | 120 | 0.8729 | 0.4800 | 0.8729 | 0.9343 |
| No log | 1.1961 | 122 | 0.7957 | 0.4608 | 0.7957 | 0.8920 |
| No log | 1.2157 | 124 | 0.7736 | 0.4608 | 0.7736 | 0.8795 |
| No log | 1.2353 | 126 | 0.8014 | 0.5123 | 0.8014 | 0.8952 |
| No log | 1.2549 | 128 | 0.7942 | 0.5123 | 0.7942 | 0.8912 |
| No log | 1.2745 | 130 | 0.7855 | 0.5123 | 0.7855 | 0.8863 |
| No log | 1.2941 | 132 | 0.8449 | 0.4697 | 0.8449 | 0.9192 |
| No log | 1.3137 | 134 | 0.8874 | 0.3453 | 0.8874 | 0.9420 |
| No log | 1.3333 | 136 | 0.8653 | 0.3883 | 0.8653 | 0.9302 |
| No log | 1.3529 | 138 | 0.8568 | 0.3883 | 0.8568 | 0.9256 |
| No log | 1.3725 | 140 | 0.8319 | 0.4300 | 0.8319 | 0.9121 |
| No log | 1.3922 | 142 | 0.8050 | 0.4574 | 0.8050 | 0.8972 |
| No log | 1.4118 | 144 | 0.8229 | 0.4681 | 0.8229 | 0.9072 |
| No log | 1.4314 | 146 | 0.8607 | 0.4404 | 0.8607 | 0.9277 |
| No log | 1.4510 | 148 | 0.8790 | 0.4404 | 0.8790 | 0.9376 |
| No log | 1.4706 | 150 | 0.9078 | 0.4265 | 0.9078 | 0.9528 |
| No log | 1.4902 | 152 | 0.8777 | 0.3863 | 0.8777 | 0.9369 |
| No log | 1.5098 | 154 | 0.8554 | 0.3863 | 0.8554 | 0.9249 |
| No log | 1.5294 | 156 | 0.8453 | 0.4404 | 0.8453 | 0.9194 |
| No log | 1.5490 | 158 | 0.9441 | 0.4885 | 0.9441 | 0.9716 |
| No log | 1.5686 | 160 | 0.9449 | 0.5078 | 0.9449 | 0.9720 |
| No log | 1.5882 | 162 | 0.7949 | 0.5134 | 0.7949 | 0.8916 |
| No log | 1.6078 | 164 | 0.7802 | 0.5253 | 0.7802 | 0.8833 |
| No log | 1.6275 | 166 | 0.8393 | 0.4787 | 0.8393 | 0.9161 |
| No log | 1.6471 | 168 | 0.7924 | 0.5509 | 0.7924 | 0.8902 |
| No log | 1.6667 | 170 | 0.7631 | 0.4267 | 0.7631 | 0.8736 |
| No log | 1.6863 | 172 | 0.8200 | 0.4038 | 0.8200 | 0.9055 |
| No log | 1.7059 | 174 | 0.8024 | 0.4143 | 0.8024 | 0.8958 |
| No log | 1.7255 | 176 | 0.7571 | 0.4124 | 0.7571 | 0.8701 |
| No log | 1.7451 | 178 | 0.7459 | 0.4903 | 0.7459 | 0.8637 |
| No log | 1.7647 | 180 | 0.7624 | 0.4198 | 0.7624 | 0.8732 |
| No log | 1.7843 | 182 | 0.7796 | 0.4606 | 0.7796 | 0.8830 |
| No log | 1.8039 | 184 | 0.7863 | 0.3902 | 0.7863 | 0.8867 |
| No log | 1.8235 | 186 | 0.7581 | 0.4606 | 0.7581 | 0.8707 |
| No log | 1.8431 | 188 | 0.7175 | 0.4391 | 0.7175 | 0.8471 |
| No log | 1.8627 | 190 | 0.6861 | 0.4503 | 0.6861 | 0.8283 |
| No log | 1.8824 | 192 | 0.6692 | 0.4416 | 0.6692 | 0.8181 |
| No log | 1.9020 | 194 | 0.6386 | 0.5758 | 0.6386 | 0.7991 |
| No log | 1.9216 | 196 | 0.6404 | 0.4794 | 0.6404 | 0.8002 |
| No log | 1.9412 | 198 | 0.6434 | 0.4422 | 0.6434 | 0.8021 |
| No log | 1.9608 | 200 | 0.6764 | 0.4186 | 0.6764 | 0.8224 |
| No log | 1.9804 | 202 | 0.7017 | 0.4054 | 0.7017 | 0.8377 |
| No log | 2.0 | 204 | 0.7397 | 0.4038 | 0.7397 | 0.8601 |
| No log | 2.0196 | 206 | 0.7107 | 0.4433 | 0.7107 | 0.8430 |
| No log | 2.0392 | 208 | 0.7232 | 0.4724 | 0.7232 | 0.8504 |
| No log | 2.0588 | 210 | 0.8160 | 0.4771 | 0.8160 | 0.9033 |
| No log | 2.0784 | 212 | 0.7856 | 0.4771 | 0.7856 | 0.8864 |
| No log | 2.0980 | 214 | 0.7010 | 0.4813 | 0.7010 | 0.8372 |
| No log | 2.1176 | 216 | 0.6708 | 0.4903 | 0.6708 | 0.8190 |
| No log | 2.1373 | 218 | 0.6698 | 0.5605 | 0.6698 | 0.8184 |
| No log | 2.1569 | 220 | 0.6663 | 0.5509 | 0.6663 | 0.8163 |
| No log | 2.1765 | 222 | 0.6611 | 0.5771 | 0.6611 | 0.8131 |
| No log | 2.1961 | 224 | 0.6549 | 0.5288 | 0.6549 | 0.8093 |
| No log | 2.2157 | 226 | 0.6551 | 0.5208 | 0.6551 | 0.8094 |
| No log | 2.2353 | 228 | 0.6757 | 0.5103 | 0.6757 | 0.8220 |
| No log | 2.2549 | 230 | 0.7199 | 0.4602 | 0.7199 | 0.8485 |
| No log | 2.2745 | 232 | 0.7185 | 0.4726 | 0.7185 | 0.8476 |
| No log | 2.2941 | 234 | 0.6726 | 0.5067 | 0.6726 | 0.8201 |
| No log | 2.3137 | 236 | 0.6853 | 0.5640 | 0.6853 | 0.8279 |
| No log | 2.3333 | 238 | 0.7693 | 0.5579 | 0.7693 | 0.8771 |
| No log | 2.3529 | 240 | 0.7670 | 0.5598 | 0.7670 | 0.8758 |
| No log | 2.3725 | 242 | 0.7339 | 0.5244 | 0.7339 | 0.8567 |
| No log | 2.3922 | 244 | 0.7896 | 0.4143 | 0.7896 | 0.8886 |
| No log | 2.4118 | 246 | 0.7923 | 0.4275 | 0.7923 | 0.8901 |
| No log | 2.4314 | 248 | 0.7561 | 0.5019 | 0.7561 | 0.8695 |
| No log | 2.4510 | 250 | 0.7731 | 0.4009 | 0.7731 | 0.8792 |
| No log | 2.4706 | 252 | 0.8671 | 0.5014 | 0.8671 | 0.9312 |
| No log | 2.4902 | 254 | 0.8950 | 0.4397 | 0.8950 | 0.9461 |
| No log | 2.5098 | 256 | 0.8284 | 0.4283 | 0.8284 | 0.9102 |
| No log | 2.5294 | 258 | 0.7933 | 0.4819 | 0.7933 | 0.8907 |
| No log | 2.5490 | 260 | 0.7524 | 0.5093 | 0.7524 | 0.8674 |
| No log | 2.5686 | 262 | 0.7630 | 0.5051 | 0.7630 | 0.8735 |
| No log | 2.5882 | 264 | 0.7771 | 0.5051 | 0.7771 | 0.8816 |
| No log | 2.6078 | 266 | 0.7654 | 0.4815 | 0.7654 | 0.8749 |
| No log | 2.6275 | 268 | 0.7621 | 0.5093 | 0.7621 | 0.8730 |
| No log | 2.6471 | 270 | 0.7314 | 0.5463 | 0.7314 | 0.8552 |
| No log | 2.6667 | 272 | 0.6931 | 0.5273 | 0.6931 | 0.8325 |
| No log | 2.6863 | 274 | 0.6994 | 0.6167 | 0.6994 | 0.8363 |
| No log | 2.7059 | 276 | 0.7370 | 0.5707 | 0.7370 | 0.8585 |
| No log | 2.7255 | 278 | 0.7205 | 0.5709 | 0.7205 | 0.8488 |
| No log | 2.7451 | 280 | 0.7258 | 0.5113 | 0.7258 | 0.8519 |
| No log | 2.7647 | 282 | 0.7274 | 0.5245 | 0.7274 | 0.8529 |
| No log | 2.7843 | 284 | 0.7044 | 0.5391 | 0.7044 | 0.8393 |
| No log | 2.8039 | 286 | 0.6914 | 0.4035 | 0.6914 | 0.8315 |
| No log | 2.8235 | 288 | 0.7647 | 0.4484 | 0.7647 | 0.8744 |
| No log | 2.8431 | 290 | 0.8021 | 0.5027 | 0.8021 | 0.8956 |
| No log | 2.8627 | 292 | 0.7749 | 0.5019 | 0.7749 | 0.8803 |
| No log | 2.8824 | 294 | 0.7324 | 0.4769 | 0.7324 | 0.8558 |
| No log | 2.9020 | 296 | 0.6821 | 0.4692 | 0.6821 | 0.8259 |
| No log | 2.9216 | 298 | 0.6830 | 0.5721 | 0.6830 | 0.8264 |
| No log | 2.9412 | 300 | 0.7206 | 0.5559 | 0.7206 | 0.8489 |
| No log | 2.9608 | 302 | 0.7080 | 0.4373 | 0.7080 | 0.8414 |
| No log | 2.9804 | 304 | 0.7858 | 0.4217 | 0.7858 | 0.8865 |
| No log | 3.0 | 306 | 1.0795 | 0.3519 | 1.0795 | 1.0390 |
| No log | 3.0196 | 308 | 1.2764 | 0.2354 | 1.2764 | 1.1298 |
| No log | 3.0392 | 310 | 1.1874 | 0.2472 | 1.1874 | 1.0897 |
| No log | 3.0588 | 312 | 0.9276 | 0.4256 | 0.9276 | 0.9631 |
| No log | 3.0784 | 314 | 0.6973 | 0.4285 | 0.6973 | 0.8351 |
| No log | 3.0980 | 316 | 0.6850 | 0.4887 | 0.6850 | 0.8276 |
| No log | 3.1176 | 318 | 0.6795 | 0.5627 | 0.6795 | 0.8243 |
| No log | 3.1373 | 320 | 0.6703 | 0.4797 | 0.6703 | 0.8187 |
| No log | 3.1569 | 322 | 0.7107 | 0.4444 | 0.7107 | 0.8430 |
| No log | 3.1765 | 324 | 0.7412 | 0.4604 | 0.7412 | 0.8609 |
| No log | 3.1961 | 326 | 0.7294 | 0.4321 | 0.7294 | 0.8541 |
| No log | 3.2157 | 328 | 0.7153 | 0.4038 | 0.7153 | 0.8457 |
| No log | 3.2353 | 330 | 0.7240 | 0.4019 | 0.7240 | 0.8509 |
| No log | 3.2549 | 332 | 0.7512 | 0.4 | 0.7512 | 0.8667 |
| No log | 3.2745 | 334 | 0.7641 | 0.3730 | 0.7641 | 0.8741 |
| No log | 3.2941 | 336 | 0.7799 | 0.4041 | 0.7799 | 0.8831 |
| No log | 3.3137 | 338 | 0.8152 | 0.3811 | 0.8152 | 0.9029 |
| No log | 3.3333 | 340 | 0.7548 | 0.4339 | 0.7548 | 0.8688 |
| No log | 3.3529 | 342 | 0.6685 | 0.4444 | 0.6685 | 0.8176 |
| No log | 3.3725 | 344 | 0.6536 | 0.5391 | 0.6536 | 0.8084 |
| No log | 3.3922 | 346 | 0.6953 | 0.6188 | 0.6953 | 0.8339 |
| No log | 3.4118 | 348 | 0.7388 | 0.5610 | 0.7388 | 0.8595 |
| No log | 3.4314 | 350 | 0.7196 | 0.5192 | 0.7196 | 0.8483 |
| No log | 3.4510 | 352 | 0.7159 | 0.5108 | 0.7159 | 0.8461 |
| No log | 3.4706 | 354 | 0.6804 | 0.4493 | 0.6804 | 0.8249 |
| No log | 3.4902 | 356 | 0.6839 | 0.4658 | 0.6839 | 0.8270 |
| No log | 3.5098 | 358 | 0.6700 | 0.4658 | 0.6700 | 0.8185 |
| No log | 3.5294 | 360 | 0.6529 | 0.5469 | 0.6529 | 0.8080 |
| No log | 3.5490 | 362 | 0.6380 | 0.5469 | 0.6380 | 0.7987 |
| No log | 3.5686 | 364 | 0.6423 | 0.5932 | 0.6423 | 0.8015 |
| No log | 3.5882 | 366 | 0.6638 | 0.5585 | 0.6638 | 0.8148 |
| No log | 3.6078 | 368 | 0.6824 | 0.5067 | 0.6824 | 0.8261 |
| No log | 3.6275 | 370 | 0.7695 | 0.4726 | 0.7695 | 0.8772 |
| No log | 3.6471 | 372 | 0.8176 | 0.4217 | 0.8176 | 0.9042 |
| No log | 3.6667 | 374 | 0.7425 | 0.5103 | 0.7425 | 0.8617 |
| No log | 3.6863 | 376 | 0.6617 | 0.5549 | 0.6617 | 0.8134 |
| No log | 3.7059 | 378 | 0.6582 | 0.5912 | 0.6582 | 0.8113 |
| No log | 3.7255 | 380 | 0.6399 | 0.6112 | 0.6399 | 0.7999 |
| No log | 3.7451 | 382 | 0.6251 | 0.5679 | 0.6251 | 0.7906 |
| No log | 3.7647 | 384 | 0.7408 | 0.5019 | 0.7408 | 0.8607 |
| No log | 3.7843 | 386 | 0.9449 | 0.4246 | 0.9449 | 0.9721 |
| No log | 3.8039 | 388 | 0.9264 | 0.4230 | 0.9264 | 0.9625 |
| No log | 3.8235 | 390 | 0.7713 | 0.4755 | 0.7713 | 0.8782 |
| No log | 3.8431 | 392 | 0.6509 | 0.5549 | 0.6509 | 0.8068 |
| No log | 3.8627 | 394 | 0.6812 | 0.6032 | 0.6812 | 0.8254 |
| No log | 3.8824 | 396 | 0.7133 | 0.5552 | 0.7133 | 0.8445 |
| No log | 3.9020 | 398 | 0.6913 | 0.5627 | 0.6913 | 0.8314 |
| No log | 3.9216 | 400 | 0.6770 | 0.5082 | 0.6770 | 0.8228 |
| No log | 3.9412 | 402 | 0.6823 | 0.5118 | 0.6823 | 0.8260 |
| No log | 3.9608 | 404 | 0.7429 | 0.4784 | 0.7429 | 0.8619 |
| No log | 3.9804 | 406 | 0.7808 | 0.5045 | 0.7808 | 0.8836 |
| No log | 4.0 | 408 | 0.7385 | 0.5033 | 0.7385 | 0.8593 |
| No log | 4.0196 | 410 | 0.6955 | 0.5131 | 0.6955 | 0.8340 |
| No log | 4.0392 | 412 | 0.6899 | 0.5131 | 0.6899 | 0.8306 |
| No log | 4.0588 | 414 | 0.6895 | 0.5243 | 0.6895 | 0.8304 |
| No log | 4.0784 | 416 | 0.6659 | 0.4998 | 0.6659 | 0.8160 |
| No log | 4.0980 | 418 | 0.6552 | 0.5359 | 0.6552 | 0.8094 |
| No log | 4.1176 | 420 | 0.6684 | 0.5359 | 0.6684 | 0.8175 |
| No log | 4.1373 | 422 | 0.6845 | 0.4990 | 0.6845 | 0.8273 |
| No log | 4.1569 | 424 | 0.6766 | 0.4998 | 0.6766 | 0.8225 |
| No log | 4.1765 | 426 | 0.6460 | 0.5359 | 0.6460 | 0.8037 |
| No log | 4.1961 | 428 | 0.6289 | 0.5185 | 0.6289 | 0.7930 |
| No log | 4.2157 | 430 | 0.6335 | 0.5212 | 0.6335 | 0.7959 |
| No log | 4.2353 | 432 | 0.6469 | 0.5243 | 0.6469 | 0.8043 |
| No log | 4.2549 | 434 | 0.7813 | 0.5148 | 0.7813 | 0.8839 |
| No log | 4.2745 | 436 | 0.8211 | 0.5387 | 0.8211 | 0.9062 |
| No log | 4.2941 | 438 | 0.8183 | 0.5387 | 0.8183 | 0.9046 |
| No log | 4.3137 | 440 | 0.6783 | 0.5243 | 0.6783 | 0.8236 |
| No log | 4.3333 | 442 | 0.6654 | 0.5467 | 0.6654 | 0.8157 |
| No log | 4.3529 | 444 | 0.7077 | 0.5346 | 0.7077 | 0.8412 |
| No log | 4.3725 | 446 | 0.7162 | 0.4767 | 0.7162 | 0.8463 |
| No log | 4.3922 | 448 | 0.7674 | 0.4041 | 0.7674 | 0.8760 |
| No log | 4.4118 | 450 | 0.8452 | 0.4686 | 0.8452 | 0.9193 |
| No log | 4.4314 | 452 | 0.8673 | 0.4701 | 0.8673 | 0.9313 |
| No log | 4.4510 | 454 | 0.7716 | 0.5148 | 0.7716 | 0.8784 |
| No log | 4.4706 | 456 | 0.6402 | 0.5108 | 0.6402 | 0.8001 |
| No log | 4.4902 | 458 | 0.6406 | 0.6102 | 0.6406 | 0.8004 |
| No log | 4.5098 | 460 | 0.6672 | 0.6468 | 0.6672 | 0.8168 |
| No log | 4.5294 | 462 | 0.6181 | 0.6167 | 0.6181 | 0.7862 |
| No log | 4.5490 | 464 | 0.6241 | 0.5879 | 0.6241 | 0.7900 |
| No log | 4.5686 | 466 | 0.6806 | 0.5794 | 0.6806 | 0.8250 |
| No log | 4.5882 | 468 | 0.7098 | 0.4755 | 0.7098 | 0.8425 |
| No log | 4.6078 | 470 | 0.7035 | 0.5118 | 0.7035 | 0.8387 |
| No log | 4.6275 | 472 | 0.6811 | 0.5082 | 0.6811 | 0.8253 |
| No log | 4.6471 | 474 | 0.7085 | 0.5472 | 0.7085 | 0.8417 |
| No log | 4.6667 | 476 | 0.7784 | 0.5470 | 0.7784 | 0.8823 |
| No log | 4.6863 | 478 | 0.7365 | 0.5833 | 0.7365 | 0.8582 |
| No log | 4.7059 | 480 | 0.6936 | 0.5168 | 0.6936 | 0.8328 |
| No log | 4.7255 | 482 | 0.7786 | 0.4505 | 0.7786 | 0.8824 |
| No log | 4.7451 | 484 | 0.7979 | 0.4526 | 0.7979 | 0.8932 |
| No log | 4.7647 | 486 | 0.7889 | 0.4526 | 0.7889 | 0.8882 |
| No log | 4.7843 | 488 | 0.7067 | 0.5419 | 0.7067 | 0.8407 |
| No log | 4.8039 | 490 | 0.6984 | 0.5516 | 0.6984 | 0.8357 |
| No log | 4.8235 | 492 | 0.7007 | 0.5516 | 0.7007 | 0.8371 |
| No log | 4.8431 | 494 | 0.7066 | 0.5419 | 0.7066 | 0.8406 |
| No log | 4.8627 | 496 | 0.7262 | 0.5063 | 0.7262 | 0.8522 |
| No log | 4.8824 | 498 | 0.7279 | 0.5063 | 0.7279 | 0.8532 |
| 0.2578 | 4.9020 | 500 | 0.7196 | 0.5153 | 0.7196 | 0.8483 |
| 0.2578 | 4.9216 | 502 | 0.7408 | 0.4781 | 0.7408 | 0.8607 |
| 0.2578 | 4.9412 | 504 | 0.8056 | 0.4303 | 0.8056 | 0.8976 |
| 0.2578 | 4.9608 | 506 | 0.9371 | 0.3791 | 0.9371 | 0.9680 |
| 0.2578 | 4.9804 | 508 | 1.0167 | 0.2707 | 1.0167 | 1.0083 |
| 0.2578 | 5.0 | 510 | 1.0239 | 0.1951 | 1.0239 | 1.0119 |
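
The Validation Loss and Mse columns coincide, consistent with an MSE training objective. The Qwk/Mse/Rmse columns could be produced by a `compute_metrics` function along these lines; this is a sketch, assuming scikit-learn is available and that continuous predictions are rounded to integer score bands before computing the kappa, since the card does not document the exact binning:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(eval_pred):
    """Return QWK, MSE, and RMSE for regression-style score predictions."""
    preds, labels = eval_pred
    preds = np.asarray(preds).squeeze()
    labels = np.asarray(labels).squeeze()
    mse = mean_squared_error(labels, preds)
    # Quadratic weighted kappa requires discrete categories; rounding to the
    # nearest integer score band is an assumption, not stated on this card.
    qwk = cohen_kappa_score(
        np.round(labels).astype(int),
        np.round(preds).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```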

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size

  • 135M parameters (F32 tensors, safetensors format)