---
library_name: transformers
license: apache-2.0
base_model: facebook/dinov2-large
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: Joseph-large-2024_09_16-batch-size32_epochs150_freeze
    results: []
---

# Joseph-large-2024_09_16-batch-size32_epochs150_freeze

This model is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.1207
- F1 Micro: 0.8214
- F1 Macro: 0.7191
- ROC AUC: 0.8814
- Accuracy: 0.3118
- Learning Rate: 0.0000
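
For reference, a minimal inference sketch is shown below. It assumes the checkpoint loads as a `transformers` image-classification model and that the task is multi-label (suggested by the F1/ROC AUC metrics above); the `model_id` path, the example image, and the 0.5 decision threshold are placeholders, not values taken from this repository.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder: substitute the actual repository id or local path of this checkpoint.
model_id = "path/to/Joseph-large-2024_09_16-batch-size32_epochs150_freeze"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("example.jpg").convert("RGB")  # placeholder image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Assuming a multi-label head (suggested by the F1/ROC AUC metrics above):
# apply a sigmoid and threshold each label independently.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```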

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 150
- mixed_precision_training: Native AMP
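
The training script itself is not included in this card. The sketch below shows one plausible way the settings above could map onto `TrainingArguments`/`Trainer`; the multi-label `problem_type`, the frozen backbone (hinted at by the `freeze` suffix in the run name), `num_labels`, and the dataset objects are assumptions or placeholders, not taken from the original code.

```python
from transformers import AutoModelForImageClassification, Trainer, TrainingArguments

# Assumption: a multi-label classification head on top of DINOv2-large, with the
# backbone frozen (the "freeze" suffix in the run name suggests this).
num_labels = 10  # placeholder: the real label count is not stated in this card
model = AutoModelForImageClassification.from_pretrained(
    "facebook/dinov2-large",
    problem_type="multi_label_classification",  # assumption based on the reported metrics
    num_labels=num_labels,
)
for param in model.dinov2.parameters():
    param.requires_grad = False

# The hyperparameters listed above, mapped onto TrainingArguments.
args = TrainingArguments(
    output_dir="Joseph-large-2024_09_16-batch-size32_epochs150_freeze",
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=150,
    fp16=True,  # "Native AMP" mixed-precision training
    eval_strategy="epoch",
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,      # placeholder: the dataset is not specified in this card
    eval_dataset=eval_dataset,        # placeholder
    compute_metrics=compute_metrics,  # e.g. the sketch under "Training results" below
)
trainer.train()
```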

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1 Micro | F1 Macro | ROC AUC | Accuracy | Learning Rate |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|:-------:|:--------:|:-------------:|
| No log | 1.0 | 273 | 0.1776 | 0.7478 | 0.5385 | 0.8364 | 0.2166 | 0.001 |
| 0.2726 | 2.0 | 546 | 0.1539 | 0.7697 | 0.5761 | 0.8448 | 0.2453 | 0.001 |
| 0.2726 | 3.0 | 819 | 0.1474 | 0.7745 | 0.6098 | 0.8447 | 0.2516 | 0.001 |
| 0.1701 | 4.0 | 1092 | 0.1465 | 0.7739 | 0.6214 | 0.8440 | 0.2536 | 0.001 |
| 0.1701 | 5.0 | 1365 | 0.1452 | 0.7815 | 0.6353 | 0.8503 | 0.2502 | 0.001 |
| 0.1622 | 6.0 | 1638 | 0.1446 | 0.7813 | 0.6142 | 0.8479 | 0.2578 | 0.001 |
| 0.1622 | 7.0 | 1911 | 0.1445 | 0.7801 | 0.6233 | 0.8500 | 0.2620 | 0.001 |
| 0.159 | 8.0 | 2184 | 0.1437 | 0.7879 | 0.6339 | 0.8585 | 0.2585 | 0.001 |
| 0.159 | 9.0 | 2457 | 0.1447 | 0.7855 | 0.6443 | 0.8548 | 0.2578 | 0.001 |
| 0.1563 | 10.0 | 2730 | 0.1539 | 0.7683 | 0.6149 | 0.8341 | 0.2443 | 0.001 |
| 0.1558 | 11.0 | 3003 | 0.1389 | 0.7897 | 0.6335 | 0.8561 | 0.2633 | 0.001 |
| 0.1558 | 12.0 | 3276 | 0.1395 | 0.7908 | 0.6406 | 0.8586 | 0.2640 | 0.001 |
| 0.155 | 13.0 | 3549 | 0.1390 | 0.7894 | 0.6557 | 0.8535 | 0.2651 | 0.001 |
| 0.155 | 14.0 | 3822 | 0.1391 | 0.7878 | 0.6405 | 0.8540 | 0.2623 | 0.001 |
| 0.154 | 15.0 | 4095 | 0.1399 | 0.7885 | 0.6406 | 0.8550 | 0.2540 | 0.001 |
| 0.154 | 16.0 | 4368 | 0.1394 | 0.7848 | 0.6375 | 0.8490 | 0.2668 | 0.001 |
| 0.1527 | 17.0 | 4641 | 0.1594 | 0.7857 | 0.6425 | 0.8640 | 0.2419 | 0.001 |
| 0.1527 | 18.0 | 4914 | 0.1319 | 0.8037 | 0.6768 | 0.8679 | 0.2755 | 0.0001 |
| 0.149 | 19.0 | 5187 | 0.1324 | 0.8038 | 0.6715 | 0.8680 | 0.2789 | 0.0001 |
| 0.149 | 20.0 | 5460 | 0.1306 | 0.8066 | 0.6734 | 0.8722 | 0.2789 | 0.0001 |
| 0.1412 | 21.0 | 5733 | 0.1303 | 0.8037 | 0.6728 | 0.8651 | 0.2817 | 0.0001 |
| 0.1385 | 22.0 | 6006 | 0.1287 | 0.8074 | 0.6735 | 0.8697 | 0.2841 | 0.0001 |
| 0.1385 | 23.0 | 6279 | 0.1287 | 0.8058 | 0.6785 | 0.8654 | 0.2841 | 0.0001 |
| 0.1377 | 24.0 | 6552 | 0.1280 | 0.8058 | 0.6841 | 0.8663 | 0.2869 | 0.0001 |
| 0.1377 | 25.0 | 6825 | 0.1274 | 0.8074 | 0.6787 | 0.8696 | 0.2859 | 0.0001 |
| 0.1361 | 26.0 | 7098 | 0.1283 | 0.8064 | 0.6740 | 0.8673 | 0.2859 | 0.0001 |
| 0.1361 | 27.0 | 7371 | 0.1268 | 0.8110 | 0.6890 | 0.8744 | 0.2883 | 0.0001 |
| 0.1354 | 28.0 | 7644 | 0.1267 | 0.8100 | 0.6813 | 0.8708 | 0.2893 | 0.0001 |
| 0.1354 | 29.0 | 7917 | 0.1268 | 0.8081 | 0.6881 | 0.8667 | 0.2918 | 0.0001 |
| 0.1339 | 30.0 | 8190 | 0.1264 | 0.8109 | 0.6873 | 0.8701 | 0.2928 | 0.0001 |
| 0.1339 | 31.0 | 8463 | 0.1258 | 0.8089 | 0.6824 | 0.8674 | 0.2914 | 0.0001 |
| 0.1332 | 32.0 | 8736 | 0.1260 | 0.8113 | 0.6924 | 0.8731 | 0.2931 | 0.0001 |
| 0.1321 | 33.0 | 9009 | 0.1250 | 0.8133 | 0.6960 | 0.8736 | 0.2911 | 0.0001 |
| 0.1321 | 34.0 | 9282 | 0.1251 | 0.8116 | 0.6891 | 0.8708 | 0.2942 | 0.0001 |
| 0.1309 | 35.0 | 9555 | 0.1249 | 0.8124 | 0.6945 | 0.8724 | 0.2956 | 0.0001 |
| 0.1309 | 36.0 | 9828 | 0.1253 | 0.8115 | 0.6971 | 0.8688 | 0.2942 | 0.0001 |
| 0.1305 | 37.0 | 10101 | 0.1248 | 0.8116 | 0.6961 | 0.8702 | 0.2952 | 0.0001 |
| 0.1305 | 38.0 | 10374 | 0.1250 | 0.8130 | 0.6991 | 0.8726 | 0.3004 | 0.0001 |
| 0.1285 | 39.0 | 10647 | 0.1252 | 0.8142 | 0.6971 | 0.8768 | 0.2952 | 0.0001 |
| 0.1285 | 40.0 | 10920 | 0.1249 | 0.8167 | 0.7070 | 0.8790 | 0.2956 | 0.0001 |
| 0.129 | 41.0 | 11193 | 0.1250 | 0.8104 | 0.6962 | 0.8684 | 0.2897 | 0.0001 |
| 0.129 | 42.0 | 11466 | 0.1235 | 0.8165 | 0.7064 | 0.8763 | 0.3039 | 0.0001 |
| 0.1277 | 43.0 | 11739 | 0.1237 | 0.8150 | 0.7047 | 0.8771 | 0.2956 | 0.0001 |
| 0.1279 | 44.0 | 12012 | 0.1237 | 0.8170 | 0.7054 | 0.8789 | 0.3008 | 0.0001 |
| 0.1279 | 45.0 | 12285 | 0.1233 | 0.8163 | 0.7058 | 0.8758 | 0.3015 | 0.0001 |
| 0.1264 | 46.0 | 12558 | 0.1230 | 0.8159 | 0.6993 | 0.8746 | 0.3008 | 0.0001 |
| 0.1264 | 47.0 | 12831 | 0.1237 | 0.8135 | 0.7026 | 0.8720 | 0.2990 | 0.0001 |
| 0.1267 | 48.0 | 13104 | 0.1233 | 0.8169 | 0.7044 | 0.8757 | 0.3018 | 0.0001 |
| 0.1267 | 49.0 | 13377 | 0.1232 | 0.8161 | 0.7050 | 0.8762 | 0.3021 | 0.0001 |
| 0.1249 | 50.0 | 13650 | 0.1227 | 0.8180 | 0.7086 | 0.8775 | 0.3015 | 0.0001 |
| 0.1249 | 51.0 | 13923 | 0.1231 | 0.8190 | 0.7108 | 0.8794 | 0.3021 | 0.0001 |
| 0.1243 | 52.0 | 14196 | 0.1228 | 0.8164 | 0.7041 | 0.8743 | 0.3021 | 0.0001 |
| 0.1243 | 53.0 | 14469 | 0.1225 | 0.8189 | 0.7080 | 0.8794 | 0.3039 | 0.0001 |
| 0.1248 | 54.0 | 14742 | 0.1238 | 0.8163 | 0.7054 | 0.8755 | 0.3018 | 0.0001 |
| 0.1233 | 55.0 | 15015 | 0.1221 | 0.8181 | 0.7093 | 0.8772 | 0.3028 | 0.0001 |
| 0.1233 | 56.0 | 15288 | 0.1226 | 0.8188 | 0.7092 | 0.8809 | 0.3049 | 0.0001 |
| 0.1237 | 57.0 | 15561 | 0.1223 | 0.8184 | 0.7056 | 0.8785 | 0.3053 | 0.0001 |
| 0.1237 | 58.0 | 15834 | 0.1223 | 0.8180 | 0.7094 | 0.8765 | 0.3028 | 0.0001 |
| 0.1234 | 59.0 | 16107 | 0.1223 | 0.8198 | 0.7102 | 0.8789 | 0.3073 | 0.0001 |
| 0.1234 | 60.0 | 16380 | 0.1237 | 0.8173 | 0.7068 | 0.8762 | 0.2980 | 0.0001 |
| 0.1232 | 61.0 | 16653 | 0.1224 | 0.8201 | 0.7139 | 0.8791 | 0.3060 | 0.0001 |
| 0.1232 | 62.0 | 16926 | 0.1222 | 0.8209 | 0.7189 | 0.8808 | 0.3028 | 1e-05 |
| 0.1204 | 63.0 | 17199 | 0.1208 | 0.8208 | 0.7191 | 0.8797 | 0.3098 | 1e-05 |
| 0.1204 | 64.0 | 17472 | 0.1209 | 0.8218 | 0.7188 | 0.8813 | 0.3108 | 1e-05 |
| 0.12 | 65.0 | 17745 | 0.1209 | 0.8210 | 0.7187 | 0.8787 | 0.3080 | 1e-05 |
| 0.1187 | 66.0 | 18018 | 0.1208 | 0.8216 | 0.7186 | 0.8805 | 0.3136 | 1e-05 |
| 0.1187 | 67.0 | 18291 | 0.1210 | 0.8232 | 0.7239 | 0.8848 | 0.3112 | 1e-05 |
| 0.1179 | 68.0 | 18564 | 0.1208 | 0.8212 | 0.7201 | 0.8815 | 0.3125 | 1e-05 |
| 0.1179 | 69.0 | 18837 | 0.1211 | 0.8210 | 0.7198 | 0.8795 | 0.3101 | 1e-05 |
| 0.1177 | 70.0 | 19110 | 0.1211 | 0.8213 | 0.7197 | 0.8802 | 0.3112 | 1e-05 |
| 0.1177 | 71.0 | 19383 | 0.1206 | 0.8206 | 0.7164 | 0.8780 | 0.3112 | 1e-05 |
| 0.1179 | 72.0 | 19656 | 0.1208 | 0.8206 | 0.7172 | 0.8783 | 0.3129 | 1e-05 |
| 0.1179 | 73.0 | 19929 | 0.1208 | 0.8217 | 0.7214 | 0.8804 | 0.3132 | 1e-05 |
| 0.1177 | 74.0 | 20202 | 0.1209 | 0.8201 | 0.7155 | 0.8760 | 0.3108 | 1e-05 |
| 0.1177 | 75.0 | 20475 | 0.1205 | 0.8207 | 0.7151 | 0.8790 | 0.3153 | 1e-05 |
| 0.1171 | 76.0 | 20748 | 0.1203 | 0.8221 | 0.7224 | 0.8820 | 0.3157 | 1e-05 |
| 0.1171 | 77.0 | 21021 | 0.1208 | 0.8232 | 0.7234 | 0.8851 | 0.3136 | 1e-05 |
| 0.1171 | 78.0 | 21294 | 0.1210 | 0.8230 | 0.7233 | 0.8837 | 0.3115 | 1e-05 |
| 0.1168 | 79.0 | 21567 | 0.1205 | 0.8202 | 0.7173 | 0.8777 | 0.3101 | 1e-05 |
| 0.1168 | 80.0 | 21840 | 0.1207 | 0.8232 | 0.7249 | 0.8843 | 0.3119 | 1e-05 |
| 0.1171 | 81.0 | 22113 | 0.1203 | 0.8221 | 0.7213 | 0.8806 | 0.3129 | 1e-05 |
| 0.1171 | 82.0 | 22386 | 0.1205 | 0.8215 | 0.7178 | 0.8796 | 0.3143 | 1e-05 |
| 0.1157 | 83.0 | 22659 | 0.1214 | 0.8180 | 0.7113 | 0.8743 | 0.3112 | 0.0000 |
| 0.1157 | 84.0 | 22932 | 0.1204 | 0.8234 | 0.7251 | 0.8827 | 0.3115 | 0.0000 |
| 0.1169 | 85.0 | 23205 | 0.1204 | 0.8230 | 0.7213 | 0.8832 | 0.3132 | 0.0000 |
| 0.1169 | 86.0 | 23478 | 0.1225 | 0.8196 | 0.7218 | 0.8800 | 0.3077 | 0.0000 |
| 0.1157 | 87.0 | 23751 | 0.1208 | 0.8204 | 0.7152 | 0.8789 | 0.3091 | 0.0000 |
| 0.1156 | 88.0 | 24024 | 0.1209 | 0.8215 | 0.7168 | 0.8824 | 0.3084 | 0.0000 |
| 0.1156 | 89.0 | 24297 | 0.1211 | 0.8245 | 0.7340 | 0.8875 | 0.3164 | 0.0000 |
| 0.1157 | 90.0 | 24570 | 0.1209 | 0.8232 | 0.7246 | 0.8861 | 0.3119 | 0.0000 |
| 0.1157 | 91.0 | 24843 | 0.1204 | 0.8201 | 0.7163 | 0.8785 | 0.3115 | 0.0000 |
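
The gap between the F1 scores and the much lower accuracy is consistent with exact-match (subset) accuracy on a multi-label task. Below is a hedged sketch of a `compute_metrics` function that would produce this kind of metric set with scikit-learn; the 0.5 threshold and the averaging choices are assumptions, not taken from the actual training code.

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score
from transformers import EvalPrediction


def compute_metrics(eval_pred: EvalPrediction, threshold: float = 0.5) -> dict:
    """Multi-label metrics matching the columns of the table above (sketch)."""
    logits, labels = eval_pred.predictions, eval_pred.label_ids
    probs = 1.0 / (1.0 + np.exp(-logits))     # sigmoid over the raw logits
    preds = (probs >= threshold).astype(int)  # independent per-label decisions
    return {
        "f1_micro": f1_score(labels, preds, average="micro"),
        "f1_macro": f1_score(labels, preds, average="macro"),
        "roc_auc": roc_auc_score(labels, probs, average="micro"),
        "accuracy": accuracy_score(labels, preds),  # exact-match (subset) accuracy
    }
```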

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Datasets 3.0.0
- Tokenizers 0.19.1