# scenario-kd-pre-ner-full_data-univner_full44

This model is a fine-tuned version of FacebookAI/xlm-roberta-base on an unspecified dataset (the dataset name is not recorded in this card). It achieves the following results on the evaluation set:
- Loss: 0.4193
- Precision: 0.8281
- Recall: 0.8175
- F1: 0.8228
- Accuracy: 0.9817
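
Since the card does not yet include usage instructions, the snippet below is a minimal, hypothetical sketch of how a token-classification checkpoint like this one is typically loaded with the `transformers` pipeline API. The example sentence and the aggregation setting are illustrative assumptions, not part of the original card.

```python
# Minimal sketch (assumption): load the checkpoint as a standard
# token-classification (NER) model via the transformers pipeline API.
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

model_id = "haryoaw/scenario-kd-pre-ner-full_data-univner_full44"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# "simple" aggregation merges word pieces into whole-entity spans.
ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

print(ner("Barack Obama visited Jakarta in 2010."))  # illustrative input
```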
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after the list):
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 44
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
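
For readers reproducing the run, the hyperparameters above map onto a Hugging Face `TrainingArguments` configuration roughly as follows. This is a sketch under the assumption that the standard `Trainer` API was used; `output_dir` is a placeholder.

```python
# Sketch (assumption): the listed hyperparameters expressed as
# Hugging Face TrainingArguments. output_dir is a placeholder path.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./scenario-kd-pre-ner",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=44,
    num_train_epochs=30,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```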
### Training results

Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
---|---|---|---|---|---|---|---|
1.4887 | 0.2910 | 500 | 0.8780 | 0.6630 | 0.6943 | 0.6783 | 0.9693 |
0.7434 | 0.5821 | 1000 | 0.7247 | 0.7040 | 0.7534 | 0.7279 | 0.9735 |
0.6458 | 0.8731 | 1500 | 0.6695 | 0.7248 | 0.7692 | 0.7463 | 0.9752 |
0.562 | 1.1641 | 2000 | 0.6301 | 0.7474 | 0.7830 | 0.7648 | 0.9768 |
0.5142 | 1.4552 | 2500 | 0.6432 | 0.7960 | 0.7413 | 0.7677 | 0.9773 |
0.4901 | 1.7462 | 3000 | 0.5939 | 0.7664 | 0.7689 | 0.7676 | 0.9768 |
0.4625 | 2.0373 | 3500 | 0.5650 | 0.7763 | 0.7899 | 0.7830 | 0.9784 |
0.4058 | 2.3283 | 4000 | 0.5634 | 0.7941 | 0.7807 | 0.7873 | 0.9786 |
0.3987 | 2.6193 | 4500 | 0.5539 | 0.7763 | 0.8000 | 0.7880 | 0.9789 |
0.3879 | 2.9104 | 5000 | 0.5344 | 0.7848 | 0.8042 | 0.7944 | 0.9789 |
0.3586 | 3.2014 | 5500 | 0.5491 | 0.7907 | 0.7915 | 0.7911 | 0.9791 |
0.3418 | 3.4924 | 6000 | 0.5209 | 0.7757 | 0.8121 | 0.7935 | 0.9789 |
0.3334 | 3.7835 | 6500 | 0.5221 | 0.7954 | 0.8000 | 0.7977 | 0.9797 |
0.3254 | 4.0745 | 7000 | 0.5208 | 0.8027 | 0.7943 | 0.7985 | 0.9793 |
0.3037 | 4.3655 | 7500 | 0.5119 | 0.7912 | 0.8000 | 0.7956 | 0.9797 |
0.2948 | 4.6566 | 8000 | 0.5057 | 0.7966 | 0.8062 | 0.8014 | 0.9796 |
0.2929 | 4.9476 | 8500 | 0.5024 | 0.8051 | 0.7987 | 0.8019 | 0.9796 |
0.2721 | 5.2386 | 9000 | 0.5030 | 0.8024 | 0.7868 | 0.7945 | 0.9796 |
0.2654 | 5.5297 | 9500 | 0.4919 | 0.8124 | 0.7940 | 0.8031 | 0.9800 |
0.2664 | 5.8207 | 10000 | 0.4992 | 0.7986 | 0.8121 | 0.8053 | 0.9798 |
0.2582 | 6.1118 | 10500 | 0.4874 | 0.8126 | 0.8000 | 0.8063 | 0.9802 |
0.2404 | 6.4028 | 11000 | 0.4980 | 0.8081 | 0.8000 | 0.8040 | 0.9803 |
0.2408 | 6.6938 | 11500 | 0.4875 | 0.8026 | 0.8036 | 0.8031 | 0.9800 |
0.2416 | 6.9849 | 12000 | 0.4830 | 0.8074 | 0.7982 | 0.8027 | 0.9799 |
0.2246 | 7.2759 | 12500 | 0.4750 | 0.8084 | 0.8116 | 0.8100 | 0.9805 |
0.2225 | 7.5669 | 13000 | 0.4839 | 0.8017 | 0.8162 | 0.8089 | 0.9807 |
0.2235 | 7.8580 | 13500 | 0.4676 | 0.8052 | 0.8134 | 0.8093 | 0.9807 |
0.2111 | 8.1490 | 14000 | 0.4718 | 0.8151 | 0.8022 | 0.8086 | 0.9806 |
0.207 | 8.4400 | 14500 | 0.4777 | 0.8036 | 0.8165 | 0.8100 | 0.9802 |
0.2063 | 8.7311 | 15000 | 0.4704 | 0.8250 | 0.7995 | 0.8120 | 0.9806 |
0.2051 | 9.0221 | 15500 | 0.4718 | 0.8027 | 0.8119 | 0.8073 | 0.9803 |
0.1922 | 9.3132 | 16000 | 0.4767 | 0.8154 | 0.8077 | 0.8115 | 0.9806 |
0.192 | 9.6042 | 16500 | 0.4735 | 0.8160 | 0.8114 | 0.8137 | 0.9811 |
0.1946 | 9.8952 | 17000 | 0.4711 | 0.8100 | 0.8176 | 0.8138 | 0.9807 |
0.1843 | 10.1863 | 17500 | 0.4666 | 0.8096 | 0.8113 | 0.8105 | 0.9808 |
0.1801 | 10.4773 | 18000 | 0.4606 | 0.8064 | 0.8150 | 0.8107 | 0.9805 |
0.1824 | 10.7683 | 18500 | 0.4573 | 0.8158 | 0.8133 | 0.8145 | 0.9810 |
0.1775 | 11.0594 | 19000 | 0.4733 | 0.8209 | 0.7951 | 0.8078 | 0.9803 |
0.1723 | 11.3504 | 19500 | 0.4567 | 0.8164 | 0.8168 | 0.8166 | 0.9813 |
0.1716 | 11.6414 | 20000 | 0.4596 | 0.8153 | 0.8062 | 0.8107 | 0.9809 |
0.1696 | 11.9325 | 20500 | 0.4553 | 0.8141 | 0.8250 | 0.8195 | 0.9813 |
0.165 | 12.2235 | 21000 | 0.4474 | 0.8225 | 0.8114 | 0.8169 | 0.9810 |
0.1609 | 12.5146 | 21500 | 0.4638 | 0.8189 | 0.8094 | 0.8141 | 0.9810 |
0.1648 | 12.8056 | 22000 | 0.4459 | 0.8122 | 0.8120 | 0.8121 | 0.9809 |
0.1599 | 13.0966 | 22500 | 0.4509 | 0.8184 | 0.8104 | 0.8144 | 0.9811 |
0.1556 | 13.3877 | 23000 | 0.4603 | 0.8167 | 0.8062 | 0.8114 | 0.9808 |
0.1559 | 13.6787 | 23500 | 0.4515 | 0.8163 | 0.8150 | 0.8157 | 0.9807 |
0.1546 | 13.9697 | 24000 | 0.4436 | 0.8089 | 0.8225 | 0.8157 | 0.9809 |
0.1487 | 14.2608 | 24500 | 0.4422 | 0.8114 | 0.8228 | 0.8170 | 0.9811 |
0.1503 | 14.5518 | 25000 | 0.4467 | 0.8180 | 0.8169 | 0.8174 | 0.9813 |
0.1485 | 14.8428 | 25500 | 0.4508 | 0.8098 | 0.8215 | 0.8156 | 0.9807 |
0.1466 | 15.1339 | 26000 | 0.4441 | 0.8157 | 0.8147 | 0.8152 | 0.9812 |
0.1432 | 15.4249 | 26500 | 0.4473 | 0.8242 | 0.8111 | 0.8176 | 0.9813 |
0.1431 | 15.7159 | 27000 | 0.4513 | 0.8194 | 0.8159 | 0.8177 | 0.9812 |
0.1444 | 16.0070 | 27500 | 0.4381 | 0.8166 | 0.8209 | 0.8188 | 0.9812 |
0.1373 | 16.2980 | 28000 | 0.4420 | 0.8163 | 0.8234 | 0.8199 | 0.9815 |
0.1375 | 16.5891 | 28500 | 0.4395 | 0.8203 | 0.8179 | 0.8191 | 0.9815 |
0.1405 | 16.8801 | 29000 | 0.4409 | 0.8227 | 0.8126 | 0.8176 | 0.9810 |
0.1369 | 17.1711 | 29500 | 0.4371 | 0.8259 | 0.8124 | 0.8191 | 0.9811 |
0.1345 | 17.4622 | 30000 | 0.4428 | 0.8248 | 0.8096 | 0.8171 | 0.9809 |
0.1356 | 17.7532 | 30500 | 0.4341 | 0.8275 | 0.8175 | 0.8225 | 0.9815 |
0.1323 | 18.0442 | 31000 | 0.4312 | 0.8229 | 0.8199 | 0.8214 | 0.9813 |
0.1302 | 18.3353 | 31500 | 0.4308 | 0.8242 | 0.8222 | 0.8232 | 0.9819 |
0.1302 | 18.6263 | 32000 | 0.4308 | 0.8217 | 0.8159 | 0.8188 | 0.9814 |
0.1308 | 18.9173 | 32500 | 0.4371 | 0.8274 | 0.8042 | 0.8156 | 0.9813 |
0.1289 | 19.2084 | 33000 | 0.4339 | 0.8305 | 0.8108 | 0.8206 | 0.9815 |
0.1273 | 19.4994 | 33500 | 0.4358 | 0.8176 | 0.8158 | 0.8167 | 0.9813 |
0.1271 | 19.7905 | 34000 | 0.4403 | 0.8229 | 0.8123 | 0.8175 | 0.9810 |
0.125 | 20.0815 | 34500 | 0.4280 | 0.8235 | 0.8201 | 0.8218 | 0.9815 |
0.1259 | 20.3725 | 35000 | 0.4341 | 0.8243 | 0.8124 | 0.8183 | 0.9812 |
0.1233 | 20.6636 | 35500 | 0.4327 | 0.8282 | 0.8075 | 0.8177 | 0.9812 |
0.1243 | 20.9546 | 36000 | 0.4253 | 0.8252 | 0.8192 | 0.8222 | 0.9814 |
0.1233 | 21.2456 | 36500 | 0.4333 | 0.8203 | 0.8114 | 0.8158 | 0.9812 |
0.1202 | 21.5367 | 37000 | 0.4253 | 0.8196 | 0.8168 | 0.8182 | 0.9814 |
0.1223 | 21.8277 | 37500 | 0.4234 | 0.8311 | 0.8142 | 0.8225 | 0.9815 |
0.1215 | 22.1187 | 38000 | 0.4203 | 0.8249 | 0.8197 | 0.8223 | 0.9818 |
0.1177 | 22.4098 | 38500 | 0.4200 | 0.8280 | 0.8225 | 0.8253 | 0.9818 |
0.1198 | 22.7008 | 39000 | 0.4257 | 0.8267 | 0.8199 | 0.8233 | 0.9818 |
0.1187 | 22.9919 | 39500 | 0.4253 | 0.8274 | 0.8222 | 0.8248 | 0.9817 |
0.1179 | 23.2829 | 40000 | 0.4261 | 0.8267 | 0.8163 | 0.8215 | 0.9812 |
0.1168 | 23.5739 | 40500 | 0.4203 | 0.8295 | 0.8156 | 0.8225 | 0.9815 |
0.1174 | 23.8650 | 41000 | 0.4216 | 0.8278 | 0.8145 | 0.8211 | 0.9816 |
0.1159 | 24.1560 | 41500 | 0.4226 | 0.8271 | 0.8207 | 0.8239 | 0.9818 |
0.1147 | 24.4470 | 42000 | 0.4274 | 0.8328 | 0.8147 | 0.8237 | 0.9814 |
0.1168 | 24.7381 | 42500 | 0.4240 | 0.8221 | 0.8147 | 0.8184 | 0.9815 |
0.1148 | 25.0291 | 43000 | 0.4222 | 0.8224 | 0.8119 | 0.8171 | 0.9814 |
0.1141 | 25.3201 | 43500 | 0.4179 | 0.8248 | 0.8197 | 0.8222 | 0.9818 |
0.1142 | 25.6112 | 44000 | 0.4204 | 0.8235 | 0.8178 | 0.8206 | 0.9815 |
0.1131 | 25.9022 | 44500 | 0.4190 | 0.8342 | 0.8222 | 0.8282 | 0.9818 |
0.114 | 26.1932 | 45000 | 0.4247 | 0.8289 | 0.8201 | 0.8245 | 0.9816 |
0.1119 | 26.4843 | 45500 | 0.4198 | 0.8290 | 0.8179 | 0.8234 | 0.9815 |
0.1132 | 26.7753 | 46000 | 0.4221 | 0.8224 | 0.8166 | 0.8195 | 0.9814 |
0.1125 | 27.0664 | 46500 | 0.4216 | 0.8306 | 0.8129 | 0.8216 | 0.9814 |
0.1103 | 27.3574 | 47000 | 0.4232 | 0.8260 | 0.8126 | 0.8193 | 0.9813 |
0.1113 | 27.6484 | 47500 | 0.4200 | 0.8321 | 0.8150 | 0.8235 | 0.9815 |
0.112 | 27.9395 | 48000 | 0.4186 | 0.8285 | 0.8227 | 0.8256 | 0.9817 |
0.111 | 28.2305 | 48500 | 0.4203 | 0.8326 | 0.8182 | 0.8254 | 0.9817 |
0.1095 | 28.5215 | 49000 | 0.4194 | 0.8300 | 0.8173 | 0.8236 | 0.9816 |
0.1104 | 28.8126 | 49500 | 0.4212 | 0.8246 | 0.8192 | 0.8219 | 0.9815 |
0.1098 | 29.1036 | 50000 | 0.4189 | 0.8278 | 0.8165 | 0.8221 | 0.9814 |
0.1097 | 29.3946 | 50500 | 0.4176 | 0.8322 | 0.8173 | 0.8247 | 0.9817 |
0.1098 | 29.6857 | 51000 | 0.4173 | 0.8252 | 0.8165 | 0.8208 | 0.9813 |
0.1104 | 29.9767 | 51500 | 0.4193 | 0.8281 | 0.8175 | 0.8228 | 0.9817 |
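
The Precision, Recall, F1, and Accuracy columns above are entity-level metrics of the kind reported by the `seqeval` package for token-classification runs. Below is a minimal sketch of a `compute_metrics` function in that style; the label list and the use of the `evaluate` library are assumptions, since the card does not include the training script.

```python
# Sketch (assumption): entity-level metrics of the kind reported above,
# computed with the seqeval metric via the `evaluate` library.
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")
label_list = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC"]  # placeholder labels

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)

    # Drop padding/special tokens (label id -100) before scoring.
    true_labels = [[label_list[l] for l in row if l != -100] for row in labels]
    true_preds = [
        [label_list[p] for p, l in zip(p_row, l_row) if l != -100]
        for p_row, l_row in zip(predictions, labels)
    ]

    results = seqeval.compute(predictions=true_preds, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```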
### Framework versions

- Transformers 4.44.2
- Pytorch 2.1.1+cu121
- Datasets 2.14.5
- Tokenizers 0.19.1