# ArabicNewSplits8_FineTuningAraBERT_noAug_task5_organization
This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.8245
- Qwk: 0.5349
- Mse: 0.8245
- Rmse: 0.9080
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
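
For reference, the hyperparameters above map onto `TrainingArguments` roughly as sketched below. This is a reconstruction, not the author's training script: `output_dir` is illustrative, and `eval_steps=2` is inferred from the results table, which logs an evaluation every 2 steps.

```python
from transformers import TrainingArguments

# Sketch reconstructing the listed hyperparameters; Adam betas=(0.9, 0.999)
# and epsilon=1e-08 are the Trainer defaults, so they need no explicit flags.
training_args = TrainingArguments(
    output_dir="ArabicNewSplits8_FineTuningAraBERT_noAug_task5_organization",  # illustrative
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    eval_strategy="steps",
    eval_steps=2,  # inferred: the results table reports an eval every 2 steps
)
```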
### Training results
| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:--------------|:------|:-----|:----------------|:-----|:-----|:-----|
No log | 0.6667 | 2 | 3.2701 | -0.0272 | 3.2701 | 1.8083 |
No log | 1.3333 | 4 | 2.0230 | 0.0007 | 2.0230 | 1.4223 |
No log | 2.0 | 6 | 1.0951 | 0.1104 | 1.0951 | 1.0465 |
No log | 2.6667 | 8 | 1.0161 | 0.1628 | 1.0161 | 1.0080 |
No log | 3.3333 | 10 | 1.0953 | 0.1021 | 1.0953 | 1.0466 |
No log | 4.0 | 12 | 1.1892 | 0.1021 | 1.1892 | 1.0905 |
No log | 4.6667 | 14 | 1.0553 | 0.1776 | 1.0553 | 1.0273 |
No log | 5.3333 | 16 | 0.9292 | 0.2898 | 0.9292 | 0.9640 |
No log | 6.0 | 18 | 0.7888 | 0.3714 | 0.7888 | 0.8882 |
No log | 6.6667 | 20 | 0.7374 | 0.3963 | 0.7374 | 0.8587 |
No log | 7.3333 | 22 | 0.7826 | 0.3813 | 0.7826 | 0.8847 |
No log | 8.0 | 24 | 0.8569 | 0.4116 | 0.8569 | 0.9257 |
No log | 8.6667 | 26 | 0.9253 | 0.5013 | 0.9253 | 0.9620 |
No log | 9.3333 | 28 | 0.7979 | 0.5106 | 0.7979 | 0.8933 |
No log | 10.0 | 30 | 0.7661 | 0.4928 | 0.7661 | 0.8753 |
No log | 10.6667 | 32 | 0.8839 | 0.5284 | 0.8839 | 0.9401 |
No log | 11.3333 | 34 | 0.8728 | 0.4542 | 0.8728 | 0.9342 |
No log | 12.0 | 36 | 0.8213 | 0.4778 | 0.8213 | 0.9063 |
No log | 12.6667 | 38 | 0.6915 | 0.5226 | 0.6915 | 0.8316 |
No log | 13.3333 | 40 | 0.6675 | 0.5472 | 0.6675 | 0.8170 |
No log | 14.0 | 42 | 0.6399 | 0.5100 | 0.6399 | 0.8000 |
No log | 14.6667 | 44 | 0.6780 | 0.5783 | 0.6780 | 0.8234 |
No log | 15.3333 | 46 | 0.6829 | 0.5861 | 0.6829 | 0.8264 |
No log | 16.0 | 48 | 0.6859 | 0.5177 | 0.6859 | 0.8282 |
No log | 16.6667 | 50 | 0.7491 | 0.5374 | 0.7491 | 0.8655 |
No log | 17.3333 | 52 | 0.7679 | 0.5159 | 0.7679 | 0.8763 |
No log | 18.0 | 54 | 0.7048 | 0.4740 | 0.7048 | 0.8395 |
No log | 18.6667 | 56 | 0.6557 | 0.5296 | 0.6557 | 0.8097 |
No log | 19.3333 | 58 | 0.6780 | 0.5760 | 0.6780 | 0.8234 |
No log | 20.0 | 60 | 0.6692 | 0.5760 | 0.6692 | 0.8181 |
No log | 20.6667 | 62 | 0.6269 | 0.5304 | 0.6269 | 0.7918 |
No log | 21.3333 | 64 | 0.6342 | 0.5522 | 0.6342 | 0.7964 |
No log | 22.0 | 66 | 0.7431 | 0.6271 | 0.7431 | 0.8620 |
No log | 22.6667 | 68 | 0.7428 | 0.6271 | 0.7428 | 0.8619 |
No log | 23.3333 | 70 | 0.6956 | 0.5706 | 0.6956 | 0.8340 |
No log | 24.0 | 72 | 0.6998 | 0.5707 | 0.6998 | 0.8365 |
No log | 24.6667 | 74 | 0.6652 | 0.5458 | 0.6652 | 0.8156 |
No log | 25.3333 | 76 | 0.6259 | 0.5055 | 0.6259 | 0.7911 |
No log | 26.0 | 78 | 0.6424 | 0.5304 | 0.6424 | 0.8015 |
No log | 26.6667 | 80 | 0.8166 | 0.6271 | 0.8166 | 0.9037 |
No log | 27.3333 | 82 | 0.9483 | 0.5556 | 0.9483 | 0.9738 |
No log | 28.0 | 84 | 0.7602 | 0.6240 | 0.7602 | 0.8719 |
No log | 28.6667 | 86 | 0.6924 | 0.5761 | 0.6924 | 0.8321 |
No log | 29.3333 | 88 | 0.6957 | 0.5708 | 0.6957 | 0.8341 |
No log | 30.0 | 90 | 0.7267 | 0.5758 | 0.7267 | 0.8525 |
No log | 30.6667 | 92 | 0.8249 | 0.5546 | 0.8249 | 0.9082 |
No log | 31.3333 | 94 | 0.9042 | 0.5498 | 0.9042 | 0.9509 |
No log | 32.0 | 96 | 0.8384 | 0.5549 | 0.8384 | 0.9156 |
No log | 32.6667 | 98 | 0.6593 | 0.5632 | 0.6593 | 0.8120 |
No log | 33.3333 | 100 | 0.6320 | 0.5647 | 0.6320 | 0.7950 |
No log | 34.0 | 102 | 0.6310 | 0.5488 | 0.6310 | 0.7944 |
No log | 34.6667 | 104 | 0.6450 | 0.5712 | 0.6450 | 0.8031 |
No log | 35.3333 | 106 | 0.8915 | 0.5318 | 0.8915 | 0.9442 |
No log | 36.0 | 108 | 1.2787 | 0.3787 | 1.2787 | 1.1308 |
No log | 36.6667 | 110 | 1.3123 | 0.3184 | 1.3123 | 1.1456 |
No log | 37.3333 | 112 | 1.1050 | 0.3462 | 1.1050 | 1.0512 |
No log | 38.0 | 114 | 0.8975 | 0.4934 | 0.8975 | 0.9474 |
No log | 38.6667 | 116 | 0.6823 | 0.5737 | 0.6823 | 0.8260 |
No log | 39.3333 | 118 | 0.6255 | 0.5775 | 0.6255 | 0.7909 |
No log | 40.0 | 120 | 0.6292 | 0.5474 | 0.6292 | 0.7932 |
No log | 40.6667 | 122 | 0.6513 | 0.5578 | 0.6513 | 0.8070 |
No log | 41.3333 | 124 | 0.8106 | 0.5700 | 0.8106 | 0.9003 |
No log | 42.0 | 126 | 1.0063 | 0.5181 | 1.0063 | 1.0031 |
No log | 42.6667 | 128 | 1.0040 | 0.5338 | 1.0040 | 1.0020 |
No log | 43.3333 | 130 | 0.8358 | 0.5524 | 0.8358 | 0.9142 |
No log | 44.0 | 132 | 0.7352 | 0.5071 | 0.7352 | 0.8575 |
No log | 44.6667 | 134 | 0.7255 | 0.5543 | 0.7255 | 0.8518 |
No log | 45.3333 | 136 | 0.6951 | 0.5708 | 0.6951 | 0.8337 |
No log | 46.0 | 138 | 0.7018 | 0.5625 | 0.7018 | 0.8377 |
No log | 46.6667 | 140 | 0.7045 | 0.5457 | 0.7045 | 0.8394 |
No log | 47.3333 | 142 | 0.7349 | 0.5457 | 0.7349 | 0.8573 |
No log | 48.0 | 144 | 0.7991 | 0.5210 | 0.7991 | 0.8939 |
No log | 48.6667 | 146 | 0.7839 | 0.5210 | 0.7839 | 0.8854 |
No log | 49.3333 | 148 | 0.7769 | 0.5100 | 0.7769 | 0.8814 |
No log | 50.0 | 150 | 0.7198 | 0.5254 | 0.7198 | 0.8484 |
No log | 50.6667 | 152 | 0.7114 | 0.5424 | 0.7114 | 0.8434 |
No log | 51.3333 | 154 | 0.7177 | 0.5430 | 0.7177 | 0.8472 |
No log | 52.0 | 156 | 0.7458 | 0.5071 | 0.7458 | 0.8636 |
No log | 52.6667 | 158 | 0.8203 | 0.5308 | 0.8203 | 0.9057 |
No log | 53.3333 | 160 | 0.8864 | 0.5521 | 0.8864 | 0.9415 |
No log | 54.0 | 162 | 0.8570 | 0.5652 | 0.8570 | 0.9258 |
No log | 54.6667 | 164 | 0.7892 | 0.5281 | 0.7892 | 0.8884 |
No log | 55.3333 | 166 | 0.7416 | 0.5041 | 0.7416 | 0.8612 |
No log | 56.0 | 168 | 0.7590 | 0.5057 | 0.7590 | 0.8712 |
No log | 56.6667 | 170 | 0.8131 | 0.5100 | 0.8131 | 0.9017 |
No log | 57.3333 | 172 | 0.8186 | 0.5087 | 0.8186 | 0.9047 |
No log | 58.0 | 174 | 0.8255 | 0.5004 | 0.8255 | 0.9086 |
No log | 58.6667 | 176 | 0.7683 | 0.5182 | 0.7683 | 0.8765 |
No log | 59.3333 | 178 | 0.7228 | 0.5845 | 0.7228 | 0.8502 |
No log | 60.0 | 180 | 0.7169 | 0.5682 | 0.7169 | 0.8467 |
No log | 60.6667 | 182 | 0.7477 | 0.5264 | 0.7477 | 0.8647 |
No log | 61.3333 | 184 | 0.7659 | 0.5284 | 0.7659 | 0.8751 |
No log | 62.0 | 186 | 0.7907 | 0.5300 | 0.7907 | 0.8892 |
No log | 62.6667 | 188 | 0.7748 | 0.5300 | 0.7748 | 0.8802 |
No log | 63.3333 | 190 | 0.7277 | 0.5435 | 0.7277 | 0.8530 |
No log | 64.0 | 192 | 0.6938 | 0.5682 | 0.6938 | 0.8330 |
No log | 64.6667 | 194 | 0.7109 | 0.5537 | 0.7109 | 0.8432 |
No log | 65.3333 | 196 | 0.7568 | 0.4872 | 0.7568 | 0.8699 |
No log | 66.0 | 198 | 0.8124 | 0.5328 | 0.8124 | 0.9013 |
No log | 66.6667 | 200 | 0.8411 | 0.5328 | 0.8411 | 0.9171 |
No log | 67.3333 | 202 | 0.8467 | 0.5622 | 0.8467 | 0.9201 |
No log | 68.0 | 204 | 0.8666 | 0.5465 | 0.8666 | 0.9309 |
No log | 68.6667 | 206 | 0.8513 | 0.5355 | 0.8513 | 0.9227 |
No log | 69.3333 | 208 | 0.8169 | 0.5300 | 0.8169 | 0.9038 |
No log | 70.0 | 210 | 0.7615 | 0.5216 | 0.7615 | 0.8726 |
No log | 70.6667 | 212 | 0.7481 | 0.5216 | 0.7481 | 0.8649 |
No log | 71.3333 | 214 | 0.7541 | 0.5216 | 0.7541 | 0.8684 |
No log | 72.0 | 216 | 0.7755 | 0.5135 | 0.7755 | 0.8806 |
No log | 72.6667 | 218 | 0.8363 | 0.5503 | 0.8363 | 0.9145 |
No log | 73.3333 | 220 | 0.8890 | 0.5474 | 0.8890 | 0.9429 |
No log | 74.0 | 222 | 0.9038 | 0.5216 | 0.9038 | 0.9507 |
No log | 74.6667 | 224 | 0.9102 | 0.5216 | 0.9102 | 0.9540 |
No log | 75.3333 | 226 | 0.8665 | 0.5276 | 0.8665 | 0.9309 |
No log | 76.0 | 228 | 0.8022 | 0.4470 | 0.8022 | 0.8956 |
No log | 76.6667 | 230 | 0.7517 | 0.4638 | 0.7517 | 0.8670 |
No log | 77.3333 | 232 | 0.7167 | 0.4711 | 0.7167 | 0.8466 |
No log | 78.0 | 234 | 0.7063 | 0.4711 | 0.7063 | 0.8404 |
No log | 78.6667 | 236 | 0.7091 | 0.4897 | 0.7091 | 0.8421 |
No log | 79.3333 | 238 | 0.7258 | 0.5041 | 0.7258 | 0.8519 |
No log | 80.0 | 240 | 0.7583 | 0.5041 | 0.7583 | 0.8708 |
No log | 80.6667 | 242 | 0.7985 | 0.4973 | 0.7985 | 0.8936 |
No log | 81.3333 | 244 | 0.8054 | 0.4973 | 0.8054 | 0.8974 |
No log | 82.0 | 246 | 0.7897 | 0.4973 | 0.7897 | 0.8886 |
No log | 82.6667 | 248 | 0.7746 | 0.4973 | 0.7746 | 0.8801 |
No log | 83.3333 | 250 | 0.7633 | 0.4959 | 0.7633 | 0.8736 |
No log | 84.0 | 252 | 0.7711 | 0.5182 | 0.7711 | 0.8781 |
No log | 84.6667 | 254 | 0.7830 | 0.5182 | 0.7830 | 0.8849 |
No log | 85.3333 | 256 | 0.8057 | 0.5488 | 0.8057 | 0.8976 |
No log | 86.0 | 258 | 0.8166 | 0.5488 | 0.8166 | 0.9037 |
No log | 86.6667 | 260 | 0.8115 | 0.5488 | 0.8115 | 0.9008 |
No log | 87.3333 | 262 | 0.7979 | 0.5381 | 0.7979 | 0.8932 |
No log | 88.0 | 264 | 0.7697 | 0.5361 | 0.7697 | 0.8773 |
No log | 88.6667 | 266 | 0.7499 | 0.5164 | 0.7499 | 0.8659 |
No log | 89.3333 | 268 | 0.7414 | 0.5164 | 0.7414 | 0.8611 |
No log | 90.0 | 270 | 0.7299 | 0.5599 | 0.7299 | 0.8544 |
No log | 90.6667 | 272 | 0.7249 | 0.5599 | 0.7249 | 0.8514 |
No log | 91.3333 | 274 | 0.7292 | 0.5599 | 0.7292 | 0.8539 |
No log | 92.0 | 276 | 0.7410 | 0.5517 | 0.7410 | 0.8608 |
No log | 92.6667 | 278 | 0.7620 | 0.5381 | 0.7620 | 0.8729 |
No log | 93.3333 | 280 | 0.7856 | 0.5381 | 0.7856 | 0.8863 |
No log | 94.0 | 282 | 0.8121 | 0.5230 | 0.8121 | 0.9012 |
No log | 94.6667 | 284 | 0.8328 | 0.5374 | 0.8328 | 0.9126 |
No log | 95.3333 | 286 | 0.8423 | 0.5374 | 0.8423 | 0.9178 |
No log | 96.0 | 288 | 0.8483 | 0.5374 | 0.8483 | 0.9211 |
No log | 96.6667 | 290 | 0.8486 | 0.5374 | 0.8486 | 0.9212 |
No log | 97.3333 | 292 | 0.8443 | 0.5374 | 0.8443 | 0.9189 |
No log | 98.0 | 294 | 0.8379 | 0.5374 | 0.8379 | 0.9154 |
No log | 98.6667 | 296 | 0.8309 | 0.5374 | 0.8309 | 0.9115 |
No log | 99.3333 | 298 | 0.8267 | 0.5374 | 0.8267 | 0.9092 |
No log | 100.0 | 300 | 0.8245 | 0.5349 | 0.8245 | 0.9080 |
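
The Qwk, Mse, and Rmse columns can be reproduced from gold labels and raw predictions as sketched below. The assumption (common for score-prediction cards like this one) is that Qwk is scikit-learn's quadratic-weighted Cohen's kappa on integer-rounded predictions, while Rmse is simply the square root of Mse (e.g. 0.9080 ≈ sqrt(0.8245) in the final row).

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Toy data: gold scores and raw model predictions (both on the same scale).
y_true = np.array([0, 1, 2, 3, 2, 1])
y_pred = np.array([0.2, 1.1, 2.9, 3.0, 1.2, 0.8])

mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)  # matches the table: Rmse is the square root of Mse
# QWK needs discrete labels, so round predictions to the nearest integer score.
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
print(f"Qwk={qwk:.4f}  Mse={mse:.4f}  Rmse={rmse:.4f}")
```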
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1