# ArabicNewSplits8_FineTuningAraBERT_noAug_task6_organization
This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.8340
- QWK (quadratic weighted kappa): 0.5723
- MSE: 0.8340
- RMSE: 0.9132
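The card does not include inference code, so here is a minimal usage sketch. It assumes the checkpoint exposes a standard Transformers sequence-classification head (the loss and MSE columns below are identical, which is consistent with a regression-style head trained with MSE loss); the example text is a placeholder.

```python
# Minimal usage sketch (assumption: standard sequence-classification head,
# consistent with the MSE-based metrics reported on this card).
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

model_id = "MayBashendy/ArabicNewSplits8_FineTuningAraBERT_noAug_task6_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

text = "..."  # placeholder: an Arabic text to score for organization
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits  # raw score(s); interpretation depends on the head
print(logits)
```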
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
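For readers reproducing the run, a sketch of how these values would map onto `transformers.TrainingArguments` (the `output_dir` name is a placeholder; Adam with the listed betas and epsilon is the library default):

```python
# Sketch: the hyperparameters above expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ArabicNewSplits8_FineTuningAraBERT_noAug_task6_organization",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the defaults
)
```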
### Training results
| Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE |
|:--------------|------:|-----:|----------------:|----:|----:|-----:|
No log | 0.6667 | 2 | 3.4526 | -0.0317 | 3.4526 | 1.8581 |
No log | 1.3333 | 4 | 2.2343 | 0.0404 | 2.2343 | 1.4948 |
No log | 2.0 | 6 | 1.2101 | 0.2208 | 1.2101 | 1.1001 |
No log | 2.6667 | 8 | 0.9193 | 0.0 | 0.9193 | 0.9588 |
No log | 3.3333 | 10 | 1.0335 | 0.0335 | 1.0335 | 1.0166 |
No log | 4.0 | 12 | 0.8504 | 0.0218 | 0.8504 | 0.9222 |
No log | 4.6667 | 14 | 0.7086 | 0.4351 | 0.7086 | 0.8418 |
No log | 5.3333 | 16 | 0.7541 | 0.4673 | 0.7541 | 0.8684 |
No log | 6.0 | 18 | 0.8298 | 0.1506 | 0.8298 | 0.9110 |
No log | 6.6667 | 20 | 0.7384 | 0.3826 | 0.7384 | 0.8593 |
No log | 7.3333 | 22 | 0.7834 | 0.3974 | 0.7834 | 0.8851 |
No log | 8.0 | 24 | 0.8762 | 0.4259 | 0.8762 | 0.9361 |
No log | 8.6667 | 26 | 1.0062 | 0.4243 | 1.0062 | 1.0031 |
No log | 9.3333 | 28 | 1.1543 | 0.3660 | 1.1543 | 1.0744 |
No log | 10.0 | 30 | 1.1532 | 0.3431 | 1.1532 | 1.0739 |
No log | 10.6667 | 32 | 0.9774 | 0.4068 | 0.9774 | 0.9886 |
No log | 11.3333 | 34 | 0.8913 | 0.3946 | 0.8913 | 0.9441 |
No log | 12.0 | 36 | 0.9398 | 0.4270 | 0.9398 | 0.9694 |
No log | 12.6667 | 38 | 0.9502 | 0.4216 | 0.9502 | 0.9748 |
No log | 13.3333 | 40 | 0.8479 | 0.4825 | 0.8479 | 0.9208 |
No log | 14.0 | 42 | 0.8696 | 0.4985 | 0.8696 | 0.9325 |
No log | 14.6667 | 44 | 0.8998 | 0.4798 | 0.8998 | 0.9486 |
No log | 15.3333 | 46 | 1.0532 | 0.4270 | 1.0532 | 1.0263 |
No log | 16.0 | 48 | 1.2459 | 0.4253 | 1.2459 | 1.1162 |
No log | 16.6667 | 50 | 1.1062 | 0.4235 | 1.1062 | 1.0518 |
No log | 17.3333 | 52 | 0.9501 | 0.4368 | 0.9501 | 0.9747 |
No log | 18.0 | 54 | 0.8000 | 0.5703 | 0.8000 | 0.8944 |
No log | 18.6667 | 56 | 0.7786 | 0.5422 | 0.7786 | 0.8824 |
No log | 19.3333 | 58 | 0.8515 | 0.6029 | 0.8515 | 0.9228 |
No log | 20.0 | 60 | 1.0081 | 0.4269 | 1.0081 | 1.0040 |
No log | 20.6667 | 62 | 1.0674 | 0.4284 | 1.0674 | 1.0331 |
No log | 21.3333 | 64 | 1.0013 | 0.4189 | 1.0013 | 1.0006 |
No log | 22.0 | 66 | 0.8419 | 0.5115 | 0.8419 | 0.9175 |
No log | 22.6667 | 68 | 0.7435 | 0.6547 | 0.7435 | 0.8623 |
No log | 23.3333 | 70 | 0.7719 | 0.6163 | 0.7719 | 0.8786 |
No log | 24.0 | 72 | 0.7651 | 0.5351 | 0.7651 | 0.8747 |
No log | 24.6667 | 74 | 0.9053 | 0.4363 | 0.9053 | 0.9515 |
No log | 25.3333 | 76 | 1.0090 | 0.4974 | 1.0090 | 1.0045 |
No log | 26.0 | 78 | 0.9488 | 0.5155 | 0.9488 | 0.9741 |
No log | 26.6667 | 80 | 0.7880 | 0.5981 | 0.7880 | 0.8877 |
No log | 27.3333 | 82 | 0.6494 | 0.6654 | 0.6494 | 0.8058 |
No log | 28.0 | 84 | 0.6348 | 0.6444 | 0.6348 | 0.7967 |
No log | 28.6667 | 86 | 0.6857 | 0.6173 | 0.6857 | 0.8281 |
No log | 29.3333 | 88 | 0.8317 | 0.5460 | 0.8317 | 0.9120 |
No log | 30.0 | 90 | 0.8303 | 0.5545 | 0.8303 | 0.9112 |
No log | 30.6667 | 92 | 0.8252 | 0.5981 | 0.8252 | 0.9084 |
No log | 31.3333 | 94 | 0.8322 | 0.5981 | 0.8322 | 0.9123 |
No log | 32.0 | 96 | 0.9369 | 0.4573 | 0.9369 | 0.9680 |
No log | 32.6667 | 98 | 0.8769 | 0.4764 | 0.8769 | 0.9364 |
No log | 33.3333 | 100 | 0.6936 | 0.6296 | 0.6936 | 0.8328 |
No log | 34.0 | 102 | 0.6399 | 0.6264 | 0.6399 | 0.8000 |
No log | 34.6667 | 104 | 0.6302 | 0.6351 | 0.6302 | 0.7938 |
No log | 35.3333 | 106 | 0.6795 | 0.6403 | 0.6795 | 0.8243 |
No log | 36.0 | 108 | 0.8245 | 0.5534 | 0.8245 | 0.9080 |
No log | 36.6667 | 110 | 0.9075 | 0.5082 | 0.9075 | 0.9526 |
No log | 37.3333 | 112 | 0.8475 | 0.5747 | 0.8475 | 0.9206 |
No log | 38.0 | 114 | 0.8374 | 0.5603 | 0.8374 | 0.9151 |
No log | 38.6667 | 116 | 0.8240 | 0.5766 | 0.8240 | 0.9077 |
No log | 39.3333 | 118 | 0.7487 | 0.5591 | 0.7487 | 0.8653 |
No log | 40.0 | 120 | 0.7509 | 0.5641 | 0.7509 | 0.8666 |
No log | 40.6667 | 122 | 0.7738 | 0.5610 | 0.7738 | 0.8797 |
No log | 41.3333 | 124 | 0.8370 | 0.4535 | 0.8370 | 0.9149 |
No log | 42.0 | 126 | 0.8331 | 0.4494 | 0.8331 | 0.9128 |
No log | 42.6667 | 128 | 0.7992 | 0.5462 | 0.7992 | 0.8940 |
No log | 43.3333 | 130 | 0.7800 | 0.5809 | 0.7800 | 0.8832 |
No log | 44.0 | 132 | 0.8098 | 0.5543 | 0.8098 | 0.8999 |
No log | 44.6667 | 134 | 0.8243 | 0.5371 | 0.8243 | 0.9079 |
No log | 45.3333 | 136 | 0.7971 | 0.5388 | 0.7971 | 0.8928 |
No log | 46.0 | 138 | 0.7945 | 0.5040 | 0.7945 | 0.8914 |
No log | 46.6667 | 140 | 0.7726 | 0.4768 | 0.7726 | 0.8790 |
No log | 47.3333 | 142 | 0.7802 | 0.5077 | 0.7802 | 0.8833 |
No log | 48.0 | 144 | 0.8691 | 0.4032 | 0.8691 | 0.9323 |
No log | 48.6667 | 146 | 0.8543 | 0.4858 | 0.8543 | 0.9243 |
No log | 49.3333 | 148 | 0.7738 | 0.4584 | 0.7738 | 0.8797 |
No log | 50.0 | 150 | 0.6812 | 0.4978 | 0.6812 | 0.8254 |
No log | 50.6667 | 152 | 0.6483 | 0.5688 | 0.6483 | 0.8051 |
No log | 51.3333 | 154 | 0.6532 | 0.5983 | 0.6532 | 0.8082 |
No log | 52.0 | 156 | 0.6650 | 0.5983 | 0.6650 | 0.8154 |
No log | 52.6667 | 158 | 0.7280 | 0.5670 | 0.7280 | 0.8532 |
No log | 53.3333 | 160 | 0.7748 | 0.5678 | 0.7748 | 0.8802 |
No log | 54.0 | 162 | 0.7402 | 0.5983 | 0.7402 | 0.8603 |
No log | 54.6667 | 164 | 0.7146 | 0.5891 | 0.7146 | 0.8453 |
No log | 55.3333 | 166 | 0.7148 | 0.5985 | 0.7148 | 0.8455 |
No log | 56.0 | 168 | 0.7414 | 0.6164 | 0.7414 | 0.8610 |
No log | 56.6667 | 170 | 0.7514 | 0.6071 | 0.7514 | 0.8669 |
No log | 57.3333 | 172 | 0.7558 | 0.6071 | 0.7558 | 0.8694 |
No log | 58.0 | 174 | 0.7710 | 0.5901 | 0.7710 | 0.8781 |
No log | 58.6667 | 176 | 0.7183 | 0.6200 | 0.7183 | 0.8475 |
No log | 59.3333 | 178 | 0.6501 | 0.6380 | 0.6501 | 0.8063 |
No log | 60.0 | 180 | 0.6183 | 0.6236 | 0.6183 | 0.7863 |
No log | 60.6667 | 182 | 0.6362 | 0.6181 | 0.6362 | 0.7976 |
No log | 61.3333 | 184 | 0.7140 | 0.5981 | 0.7140 | 0.8450 |
No log | 62.0 | 186 | 0.8421 | 0.5427 | 0.8421 | 0.9177 |
No log | 62.6667 | 188 | 0.9839 | 0.4686 | 0.9839 | 0.9919 |
No log | 63.3333 | 190 | 1.0033 | 0.4535 | 1.0033 | 1.0017 |
No log | 64.0 | 192 | 0.9185 | 0.4976 | 0.9185 | 0.9584 |
No log | 64.6667 | 194 | 0.8244 | 0.5571 | 0.8244 | 0.9080 |
No log | 65.3333 | 196 | 0.7345 | 0.6078 | 0.7345 | 0.8571 |
No log | 66.0 | 198 | 0.7035 | 0.5934 | 0.7035 | 0.8388 |
No log | 66.6667 | 200 | 0.7409 | 0.5589 | 0.7409 | 0.8608 |
No log | 67.3333 | 202 | 0.8480 | 0.5709 | 0.8480 | 0.9208 |
No log | 68.0 | 204 | 0.9548 | 0.5001 | 0.9548 | 0.9771 |
No log | 68.6667 | 206 | 1.0006 | 0.4646 | 1.0006 | 1.0003 |
No log | 69.3333 | 208 | 1.0472 | 0.4482 | 1.0472 | 1.0233 |
No log | 70.0 | 210 | 1.0085 | 0.4646 | 1.0085 | 1.0042 |
No log | 70.6667 | 212 | 0.9211 | 0.5296 | 0.9211 | 0.9597 |
No log | 71.3333 | 214 | 0.8564 | 0.5678 | 0.8564 | 0.9254 |
No log | 72.0 | 216 | 0.8190 | 0.5764 | 0.8190 | 0.9050 |
No log | 72.6667 | 218 | 0.8210 | 0.5678 | 0.8210 | 0.9061 |
No log | 73.3333 | 220 | 0.8126 | 0.5764 | 0.8126 | 0.9014 |
No log | 74.0 | 222 | 0.8188 | 0.5764 | 0.8188 | 0.9049 |
No log | 74.6667 | 224 | 0.8274 | 0.5497 | 0.8274 | 0.9096 |
No log | 75.3333 | 226 | 0.8370 | 0.5497 | 0.8370 | 0.9149 |
No log | 76.0 | 228 | 0.8657 | 0.5001 | 0.8657 | 0.9304 |
No log | 76.6667 | 230 | 0.8825 | 0.5001 | 0.8825 | 0.9394 |
No log | 77.3333 | 232 | 0.8894 | 0.5244 | 0.8894 | 0.9431 |
No log | 78.0 | 234 | 0.8879 | 0.5244 | 0.8879 | 0.9423 |
No log | 78.6667 | 236 | 0.8604 | 0.5427 | 0.8604 | 0.9276 |
No log | 79.3333 | 238 | 0.8326 | 0.5497 | 0.8326 | 0.9124 |
No log | 80.0 | 240 | 0.7870 | 0.5678 | 0.7870 | 0.8872 |
No log | 80.6667 | 242 | 0.7509 | 0.5764 | 0.7509 | 0.8665 |
No log | 81.3333 | 244 | 0.7092 | 0.5935 | 0.7092 | 0.8421 |
No log | 82.0 | 246 | 0.6965 | 0.6074 | 0.6965 | 0.8346 |
No log | 82.6667 | 248 | 0.6998 | 0.5935 | 0.6998 | 0.8366 |
No log | 83.3333 | 250 | 0.7183 | 0.5935 | 0.7183 | 0.8475 |
No log | 84.0 | 252 | 0.7599 | 0.5981 | 0.7599 | 0.8717 |
No log | 84.6667 | 254 | 0.8164 | 0.5678 | 0.8164 | 0.9036 |
No log | 85.3333 | 256 | 0.8851 | 0.5678 | 0.8851 | 0.9408 |
No log | 86.0 | 258 | 0.9300 | 0.5427 | 0.9300 | 0.9644 |
No log | 86.6667 | 260 | 0.9437 | 0.5234 | 0.9437 | 0.9714 |
No log | 87.3333 | 262 | 0.9368 | 0.5234 | 0.9368 | 0.9679 |
No log | 88.0 | 264 | 0.9105 | 0.5427 | 0.9105 | 0.9542 |
No log | 88.6667 | 266 | 0.8754 | 0.5427 | 0.8754 | 0.9356 |
No log | 89.3333 | 268 | 0.8341 | 0.5497 | 0.8341 | 0.9133 |
No log | 90.0 | 270 | 0.7874 | 0.5678 | 0.7874 | 0.8874 |
No log | 90.6667 | 272 | 0.7526 | 0.5892 | 0.7526 | 0.8675 |
No log | 91.3333 | 274 | 0.7188 | 0.5758 | 0.7188 | 0.8478 |
No log | 92.0 | 276 | 0.7026 | 0.5894 | 0.7026 | 0.8382 |
No log | 92.6667 | 278 | 0.7010 | 0.5937 | 0.7010 | 0.8372 |
No log | 93.3333 | 280 | 0.7121 | 0.5847 | 0.7121 | 0.8439 |
No log | 94.0 | 282 | 0.7318 | 0.5892 | 0.7318 | 0.8555 |
No log | 94.6667 | 284 | 0.7518 | 0.5709 | 0.7518 | 0.8671 |
No log | 95.3333 | 286 | 0.7656 | 0.5709 | 0.7656 | 0.8750 |
No log | 96.0 | 288 | 0.7820 | 0.5497 | 0.7820 | 0.8843 |
No log | 96.6667 | 290 | 0.7991 | 0.5497 | 0.7991 | 0.8939 |
No log | 97.3333 | 292 | 0.8130 | 0.5497 | 0.8130 | 0.9017 |
No log | 98.0 | 294 | 0.8224 | 0.5497 | 0.8224 | 0.9069 |
No log | 98.6667 | 296 | 0.8284 | 0.5723 | 0.8284 | 0.9101 |
No log | 99.3333 | 298 | 0.8328 | 0.5723 | 0.8328 | 0.9126 |
No log | 100.0 | 300 | 0.8340 | 0.5723 | 0.8340 | 0.9132 |
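The metric code is not included on the card. Since the validation loss and MSE columns coincide at every step, the model was most likely trained as a regression task with an MSE loss; below is a hedged sketch of a `compute_metrics` function that would produce these three columns under that assumption (rounding predictions to integer scores for QWK is also an assumption).

```python
# Sketch of a compute_metrics function yielding the QWK/MSE/RMSE columns.
# Assumptions: regression head (one output per example), ordinal integer labels.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(eval_pred):
    preds, labels = eval_pred
    preds = preds.squeeze(-1)                # (n, 1) regression outputs -> (n,)
    mse = mean_squared_error(labels, preds)  # matches the MSE-based validation loss
    return {
        "qwk": cohen_kappa_score(
            labels.astype(int),
            np.rint(preds).astype(int),      # round continuous scores to classes
            weights="quadratic",
        ),
        "mse": mse,
        "rmse": np.sqrt(mse),
    }
```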
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1