# Sfaya-W-Ary-Arz-MSA-v2
This model is a fine-tuned version of aubmindlab/bert-large-arabertv02 on an unknown dataset. It achieves the following results on the evaluation set (best checkpoint, step 500):
- Loss: 0.0181
- Accuracy: 0.9934
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 200
- num_epochs: 30
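With a linear scheduler and 200 warmup steps, the learning rate ramps from 0 to 5e-5 over the first 200 steps, then decays linearly to 0 by the last training step. A minimal sketch of that schedule (`linear_schedule_lr` is an illustrative helper, not a transformers API; the total-step count of 16,110 is an estimate from the results table below, roughly 537 steps/epoch over 30 epochs):

```python
def linear_schedule_lr(step, base_lr=5e-5, warmup_steps=200, total_steps=16110):
    """Learning rate at a given optimizer step under linear warmup
    followed by linear decay (the behavior of lr_scheduler_type=linear)."""
    if step < warmup_steps:
        # Warmup phase: scale linearly from 0 up to base_lr.
        return base_lr * step / warmup_steps
    # Decay phase: scale linearly from base_lr down to 0 at total_steps.
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

For example, the learning rate peaks at exactly 5e-5 at step 200 and is already back below half its peak by the midpoint of training.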
### Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy |
---|---|---|---|---|
0.7461 | 0.1862 | 100 | 0.1725 | 0.9259 |
0.0787 | 0.3724 | 200 | 0.1039 | 0.9823 |
0.0752 | 0.5587 | 300 | 0.0383 | 0.9923 |
0.077 | 0.7449 | 400 | 0.0367 | 0.9923 |
0.107 | 0.9311 | 500 | 0.0181 | 0.9934 |
1.1834 | 1.1173 | 600 | 1.1050 | 0.3208 |
1.1227 | 1.3035 | 700 | 1.1039 | 0.3208 |
1.1254 | 1.4898 | 800 | 1.0972 | 0.5719 |
1.1284 | 1.6760 | 900 | 1.1011 | 0.3208 |
1.1155 | 1.8622 | 1000 | 1.1178 | 0.5719 |
1.1226 | 2.0484 | 1100 | 1.1034 | 0.1073 |
1.12 | 2.2346 | 1200 | 1.1215 | 0.1073 |
1.1231 | 2.4209 | 1300 | 1.1328 | 0.5719 |
1.1218 | 2.6071 | 1400 | 1.1038 | 0.3208 |
1.1238 | 2.7933 | 1500 | 1.1014 | 0.3208 |
1.1175 | 2.9795 | 1600 | 1.1151 | 0.1073 |
1.1211 | 3.1657 | 1700 | 1.1041 | 0.1073 |
1.1115 | 3.3520 | 1800 | 1.0970 | 0.3208 |
1.1181 | 3.5382 | 1900 | 1.1004 | 0.1073 |
1.1236 | 3.7244 | 2000 | 1.1096 | 0.1073 |
1.1193 | 3.9106 | 2100 | 1.0980 | 0.5719 |
1.1188 | 4.0968 | 2200 | 1.0992 | 0.5719 |
1.1162 | 4.2831 | 2300 | 1.1094 | 0.3208 |
1.1201 | 4.4693 | 2400 | 1.1133 | 0.1073 |
1.1283 | 4.6555 | 2500 | 1.0976 | 0.3208 |
1.1221 | 4.8417 | 2600 | 1.0964 | 0.5719 |
1.1231 | 5.0279 | 2700 | 1.0996 | 0.1073 |
1.1098 | 5.2142 | 2800 | 1.0979 | 0.3208 |
1.1176 | 5.4004 | 2900 | 1.1053 | 0.3208 |
1.1207 | 5.5866 | 3000 | 1.0993 | 0.5719 |
1.1112 | 5.7728 | 3100 | 1.0965 | 0.5719 |
1.1168 | 5.9590 | 3200 | 1.1187 | 0.5719 |
1.1173 | 6.1453 | 3300 | 1.1025 | 0.1073 |
1.1154 | 6.3315 | 3400 | 1.0999 | 0.3208 |
1.116 | 6.5177 | 3500 | 1.1015 | 0.1073 |
1.1189 | 6.7039 | 3600 | 1.0981 | 0.5719 |
1.1179 | 6.8901 | 3700 | 1.0968 | 0.3208 |
1.1106 | 7.0764 | 3800 | 1.1123 | 0.3208 |
1.1098 | 7.2626 | 3900 | 1.1031 | 0.5719 |
1.118 | 7.4488 | 4000 | 1.0983 | 0.3208 |
1.1137 | 7.6350 | 4100 | 1.0977 | 0.3208 |
1.1166 | 7.8212 | 4200 | 1.1042 | 0.3208 |
1.1202 | 8.0074 | 4300 | 1.1001 | 0.1073 |
1.1154 | 8.1937 | 4400 | 1.1004 | 0.1073 |
1.1088 | 8.3799 | 4500 | 1.0971 | 0.5719 |
1.1081 | 8.5661 | 4600 | 1.1103 | 0.1073 |
1.1131 | 8.7523 | 4700 | 1.1140 | 0.1073 |
1.1164 | 8.9385 | 4800 | 1.1065 | 0.5719 |
1.1116 | 9.1248 | 4900 | 1.0964 | 0.5719 |
1.1149 | 9.3110 | 5000 | 1.0986 | 0.3208 |
1.1139 | 9.4972 | 5100 | 1.1041 | 0.1073 |
1.1117 | 9.6834 | 5200 | 1.1081 | 0.3208 |
1.1174 | 9.8696 | 5300 | 1.0973 | 0.3208 |
1.1179 | 10.0559 | 5400 | 1.0997 | 0.1073 |
1.1112 | 10.2421 | 5500 | 1.1062 | 0.1073 |
1.1125 | 10.4283 | 5600 | 1.1064 | 0.1073 |
1.1143 | 10.6145 | 5700 | 1.0968 | 0.5719 |
1.1086 | 10.8007 | 5800 | 1.1029 | 0.5719 |
1.1147 | 10.9870 | 5900 | 1.0969 | 0.3208 |
1.1159 | 11.1732 | 6000 | 1.1001 | 0.1073 |
1.1217 | 11.3594 | 6100 | 1.0967 | 0.5719 |
1.1147 | 11.5456 | 6200 | 1.1008 | 0.3208 |
1.1118 | 11.7318 | 6300 | 1.0992 | 0.1073 |
1.1246 | 11.9181 | 6400 | 1.0980 | 0.3208 |
1.1153 | 12.1043 | 6500 | 1.0983 | 0.5719 |
1.1195 | 12.2905 | 6600 | 1.0987 | 0.5719 |
1.12 | 12.4767 | 6700 | 1.1035 | 0.3208 |
1.114 | 12.6629 | 6800 | 1.0971 | 0.3208 |
1.1068 | 12.8492 | 6900 | 1.0972 | 0.5719 |
1.1169 | 13.0354 | 7000 | 1.1070 | 0.1073 |
1.1122 | 13.2216 | 7100 | 1.1055 | 0.1073 |
1.1155 | 13.4078 | 7200 | 1.0976 | 0.5719 |
1.1102 | 13.5940 | 7300 | 1.0972 | 0.3208 |
1.1113 | 13.7803 | 7400 | 1.1034 | 0.1073 |
1.115 | 13.9665 | 7500 | 1.0974 | 0.5719 |
1.1168 | 14.1527 | 7600 | 1.0972 | 0.5719 |
1.1099 | 14.3389 | 7700 | 1.0995 | 0.1073 |
1.1109 | 14.5251 | 7800 | 1.0985 | 0.3208 |
1.1146 | 14.7114 | 7900 | 1.1019 | 0.3208 |
1.1139 | 14.8976 | 8000 | 1.0993 | 0.3208 |
1.0681 | 15.0838 | 8100 | 1.2951 | 0.1073 |
0.9136 | 15.2700 | 8200 | 1.5287 | 0.1073 |
0.9282 | 15.4562 | 8300 | 1.4762 | 0.1073 |
0.8762 | 15.6425 | 8400 | 1.4805 | 0.1073 |
0.8584 | 15.8287 | 8500 | 1.5437 | 0.1073 |
0.8367 | 16.0149 | 8600 | 1.4731 | 0.1073 |
0.822 | 16.2011 | 8700 | 1.5016 | 0.1073 |
0.7898 | 16.3873 | 8800 | 1.5877 | 0.1073 |
0.7783 | 16.5736 | 8900 | 1.5652 | 0.1073 |
0.781 | 16.7598 | 9000 | 1.5188 | 0.1073 |
0.7742 | 16.9460 | 9100 | 1.5622 | 0.1073 |
0.7582 | 17.1322 | 9200 | 1.5295 | 0.1073 |
0.7631 | 17.3184 | 9300 | 1.5113 | 0.1073 |
0.7522 | 17.5047 | 9400 | 1.5878 | 0.1073 |
0.744 | 17.6909 | 9500 | 1.5649 | 0.1073 |
0.7468 | 17.8771 | 9600 | 1.5520 | 0.1073 |
0.7537 | 18.0633 | 9700 | 1.5341 | 0.1073 |
0.734 | 18.2495 | 9800 | 1.5497 | 0.1073 |
0.7426 | 18.4358 | 9900 | 1.5060 | 0.1073 |
0.7433 | 18.6220 | 10000 | 1.4848 | 0.1073 |
0.7259 | 18.8082 | 10100 | 1.5113 | 0.1073 |
0.7366 | 18.9944 | 10200 | 1.4542 | 0.1073 |
0.7455 | 19.1806 | 10300 | 1.5123 | 0.1073 |
0.7153 | 19.3669 | 10400 | 1.5229 | 0.1073 |
0.738 | 19.5531 | 10500 | 1.4720 | 0.1073 |
0.739 | 19.7393 | 10600 | 1.4896 | 0.1073 |
0.7339 | 19.9255 | 10700 | 1.5350 | 0.1073 |
0.6962 | 20.1117 | 10800 | 1.5208 | 0.1073 |
0.6877 | 20.2980 | 10900 | 1.5533 | 0.1073 |
0.6979 | 20.4842 | 11000 | 1.6745 | 0.1073 |
0.7122 | 20.6704 | 11100 | 1.6682 | 0.1073 |
0.724 | 20.8566 | 11200 | 1.5583 | 0.1073 |
0.7243 | 21.0428 | 11300 | 1.6967 | 0.1073 |
0.7025 | 21.2291 | 11400 | 1.8119 | 0.1073 |
0.7003 | 21.4153 | 11500 | 1.8016 | 0.1073 |
0.6928 | 21.6015 | 11600 | 1.8487 | 0.1073 |
0.7033 | 21.7877 | 11700 | 1.8907 | 0.1073 |
0.6718 | 21.9739 | 11800 | 1.9132 | 0.1073 |
0.6782 | 22.1601 | 11900 | 1.9787 | 0.1073 |
0.6721 | 22.3464 | 12000 | 2.0347 | 0.1073 |
0.6697 | 22.5326 | 12100 | 2.0064 | 0.1073 |
0.676 | 22.7188 | 12200 | 2.0380 | 0.1073 |
0.6785 | 22.9050 | 12300 | 2.0832 | 0.1073 |
0.694 | 23.0912 | 12400 | 2.1028 | 0.1073 |
0.6591 | 23.2775 | 12500 | 2.0140 | 0.1073 |
0.6599 | 23.4637 | 12600 | 2.1014 | 0.1073 |
0.6674 | 23.6499 | 12700 | 2.0608 | 0.1073 |
0.6726 | 23.8361 | 12800 | 1.9355 | 0.1073 |
0.6499 | 24.0223 | 12900 | 1.9992 | 0.1073 |
0.6346 | 24.2086 | 13000 | 2.1414 | 0.1073 |
0.6336 | 24.3948 | 13100 | 2.1082 | 0.1073 |
0.6409 | 24.5810 | 13200 | 2.1437 | 0.1073 |
0.6423 | 24.7672 | 13300 | 2.1663 | 0.1073 |
0.6341 | 24.9534 | 13400 | 2.1782 | 0.1073 |
0.6017 | 25.1397 | 13500 | 2.1661 | 0.1073 |
0.6422 | 25.3259 | 13600 | 2.1839 | 0.1073 |
0.6446 | 25.5121 | 13700 | 2.1471 | 0.1073 |
0.6309 | 25.6983 | 13800 | 2.1819 | 0.1073 |
0.6152 | 25.8845 | 13900 | 2.1905 | 0.1073 |
0.6204 | 26.0708 | 14000 | 2.1705 | 0.1073 |
0.6161 | 26.2570 | 14100 | 2.1816 | 0.1073 |
0.6304 | 26.4432 | 14200 | 2.1171 | 0.1073 |
0.6075 | 26.6294 | 14300 | 2.2054 | 0.1073 |
0.6224 | 26.8156 | 14400 | 2.1776 | 0.1073 |
0.6019 | 27.0019 | 14500 | 2.2348 | 0.1073 |
0.6168 | 27.1881 | 14600 | 2.2317 | 0.1073 |
0.6371 | 27.3743 | 14700 | 2.1990 | 0.1073 |
0.5835 | 27.5605 | 14800 | 2.2093 | 0.1073 |
0.603 | 27.7467 | 14900 | 2.1909 | 0.1073 |
0.6151 | 27.9330 | 15000 | 2.1936 | 0.1073 |
0.5877 | 28.1192 | 15100 | 2.2035 | 0.1073 |
0.5989 | 28.3054 | 15200 | 2.2184 | 0.1073 |
0.5882 | 28.4916 | 15300 | 2.2206 | 0.1073 |
0.6142 | 28.6778 | 15400 | 2.2295 | 0.1073 |
0.6037 | 28.8641 | 15500 | 2.2229 | 0.1073 |
0.5787 | 29.0503 | 15600 | 2.2189 | 0.1073 |
0.602 | 29.2365 | 15700 | 2.2256 | 0.1073 |
0.5831 | 29.4227 | 15800 | 2.2210 | 0.1073 |
0.5661 | 29.6089 | 15900 | 2.2218 | 0.1073 |
0.5827 | 29.7952 | 16000 | 2.2205 | 0.1073 |
0.6213 | 29.9814 | 16100 | 2.2204 | 0.1073 |
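The training-set size is not documented, but it can be estimated from the logged (epoch, step) pairs in the table above. A back-of-the-envelope sketch (variable names are illustrative, and the result is only an approximation recovered from the logs):

```python
# First row of the results table: step 100 was logged at epoch 0.1862.
step, epoch = 100, 0.1862

# Implied number of optimizer steps per epoch.
steps_per_epoch = round(step / epoch)  # ≈ 537

# At train_batch_size=32, each step consumes 32 examples.
train_batch_size = 32
approx_train_examples = steps_per_epoch * train_batch_size  # ≈ 17,184
```

This suggests a training set of roughly 17,000 examples.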
### Framework versions
- Transformers 4.45.2
- Pytorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.20.3