---
license: apache-2.0
base_model: facebook/convnextv2-tiny-1k-224
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: convnextv2-tiny-1k-224-finetuned-pattern-edge
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: train
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.81
---

# convnextv2-tiny-1k-224-finetuned-pattern-edge

This model is a fine-tuned version of [facebook/convnextv2-tiny-1k-224](https://huggingface.co/facebook/convnextv2-tiny-1k-224) on the imagefolder dataset. It achieves the following results on the evaluation set:

- Loss: 0.9202
- Accuracy: 0.81
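The card does not include a usage snippet. A minimal inference sketch, assuming the checkpoint is hosted on the Hub under the repo id shown below (adjust it to wherever this model actually lives) and that the class labels were saved in the model config:

```python
from transformers import pipeline

# Assumed repo id -- swap in the actual location of this checkpoint.
classifier = pipeline(
    "image-classification",
    model="vishalkatheriya18/convnextv2-tiny-1k-224-finetuned-pattern-edge",
)

# Accepts a local file path, a PIL.Image, or an image URL;
# returns a list of {"label": ..., "score": ...} dicts.
preds = classifier("path/to/image.jpg")
print(preds)
```

The `pipeline` helper applies the same 224x224 preprocessing the model was fine-tuned with via the bundled image processor.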

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 120
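With gradient accumulation, the effective batch size and the linear-warmup length follow directly from the values listed above. A small sketch using only the numbers on this card (single-device training is assumed, which is what makes 32 × 4 = 128 come out right):

```python
# Hyperparameters copied from the list above.
train_batch_size = 32
gradient_accumulation_steps = 4
num_devices = 1                      # assumed: 128 = 32 * 4 * 1
lr_scheduler_warmup_ratio = 0.1
total_optimizer_steps = 3360         # final step reached during training

# Each optimizer step accumulates gradients over 4 micro-batches of 32.
total_train_batch_size = train_batch_size * gradient_accumulation_steps * num_devices
print(total_train_batch_size)  # 128, matching total_train_batch_size above

# The linear scheduler warms up over the first 10% of optimizer steps.
warmup_steps = int(lr_scheduler_warmup_ratio * total_optimizer_steps)
print(warmup_steps)  # 336
```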

### Training results

| Training Loss | Epoch    | Step | Validation Loss | Accuracy |
|:-------------:|:--------:|:----:|:---------------:|:--------:|
| 2.302         | 0.9912   | 28   | 2.2666          | 0.1575   |
| 2.2226        | 1.9823   | 56   | 2.1654          | 0.315    |
| 2.0639        | 2.9735   | 84   | 1.9970          | 0.445    |
| 1.8559        | 4.0      | 113  | 1.7373          | 0.56     |
| 1.5966        | 4.9912   | 141  | 1.4823          | 0.605    |
| 1.3967        | 5.9823   | 169  | 1.2925          | 0.6125   |
| 1.204         | 6.9735   | 197  | 1.0512          | 0.68     |
| 1.0206        | 8.0      | 226  | 0.9307          | 0.7025   |
| 0.9408        | 8.9912   | 254  | 0.8286          | 0.7425   |
| 0.8501        | 9.9823   | 282  | 0.8590          | 0.6975   |
| 0.7545        | 10.9735  | 310  | 0.7702          | 0.7475   |
| 0.7484        | 12.0     | 339  | 0.7739          | 0.765    |
| 0.6909        | 12.9912  | 367  | 0.7344          | 0.75     |
| 0.6558        | 13.9823  | 395  | 0.6874          | 0.775    |
| 0.5923        | 14.9735  | 423  | 0.6641          | 0.7675   |
| 0.5764        | 16.0     | 452  | 0.6110          | 0.7925   |
| 0.5235        | 16.9912  | 480  | 0.6806          | 0.76     |
| 0.4883        | 17.9823  | 508  | 0.7903          | 0.76     |
| 0.4682        | 18.9735  | 536  | 0.6469          | 0.7825   |
| 0.441         | 20.0     | 565  | 0.6694          | 0.7825   |
| 0.4201        | 20.9912  | 593  | 0.7145          | 0.7625   |
| 0.387         | 21.9823  | 621  | 0.6505          | 0.7775   |
| 0.4034        | 22.9735  | 649  | 0.6169          | 0.7875   |
| 0.3041        | 24.0     | 678  | 0.6416          | 0.795    |
| 0.3021        | 24.9912  | 706  | 0.6992          | 0.775    |
| 0.2853        | 25.9823  | 734  | 0.6566          | 0.7975   |
| 0.27          | 26.9735  | 762  | 0.6970          | 0.7825   |
| 0.2722        | 28.0     | 791  | 0.6863          | 0.785    |
| 0.2143        | 28.9912  | 819  | 0.6794          | 0.795    |
| 0.2238        | 29.9823  | 847  | 0.6782          | 0.7975   |
| 0.2387        | 30.9735  | 875  | 0.6945          | 0.81     |
| 0.223         | 32.0     | 904  | 0.7377          | 0.7825   |
| 0.2211        | 32.9912  | 932  | 0.7431          | 0.7775   |
| 0.1882        | 33.9823  | 960  | 0.7029          | 0.815    |
| 0.1562        | 34.9735  | 988  | 0.6887          | 0.815    |
| 0.1689        | 36.0     | 1017 | 0.7190          | 0.7975   |
| 0.1886        | 36.9912  | 1045 | 0.7678          | 0.795    |
| 0.1887        | 37.9823  | 1073 | 0.7334          | 0.81     |
| 0.1531        | 38.9735  | 1101 | 0.7359          | 0.7925   |
| 0.1662        | 40.0     | 1130 | 0.7594          | 0.8075   |
| 0.1273        | 40.9912  | 1158 | 0.7342          | 0.81     |
| 0.1986        | 41.9823  | 1186 | 0.7781          | 0.805    |
| 0.1891        | 42.9735  | 1214 | 0.7376          | 0.8225   |
| 0.1573        | 44.0     | 1243 | 0.7304          | 0.815    |
| 0.1536        | 44.9912  | 1271 | 0.7773          | 0.8      |
| 0.1562        | 45.9823  | 1299 | 0.7623          | 0.8      |
| 0.1264        | 46.9735  | 1327 | 0.8314          | 0.7925   |
| 0.1596        | 48.0     | 1356 | 0.7831          | 0.8175   |
| 0.1237        | 48.9912  | 1384 | 0.7949          | 0.8      |
| 0.1355        | 49.9823  | 1412 | 0.7813          | 0.795    |
| 0.1251        | 50.9735  | 1440 | 0.7647          | 0.81     |
| 0.1181        | 52.0     | 1469 | 0.7552          | 0.8175   |
| 0.1224        | 52.9912  | 1497 | 0.8346          | 0.795    |
| 0.1201        | 53.9823  | 1525 | 0.7741          | 0.7975   |
| 0.1109        | 54.9735  | 1553 | 0.7724          | 0.785    |
| 0.1084        | 56.0     | 1582 | 0.7904          | 0.805    |
| 0.1187        | 56.9912  | 1610 | 0.7424          | 0.8125   |
| 0.0935        | 57.9823  | 1638 | 0.7411          | 0.815    |
| 0.1023        | 58.9735  | 1666 | 0.7476          | 0.81     |
| 0.1166        | 60.0     | 1695 | 0.7742          | 0.8175   |
| 0.099         | 60.9912  | 1723 | 0.7697          | 0.815    |
| 0.1157        | 61.9823  | 1751 | 0.8538          | 0.8      |
| 0.1137        | 62.9735  | 1779 | 0.8545          | 0.8125   |
| 0.094         | 64.0     | 1808 | 0.8463          | 0.7925   |
| 0.1161        | 64.9912  | 1836 | 0.8351          | 0.81     |
| 0.08          | 65.9823  | 1864 | 0.8610          | 0.7925   |
| 0.0799        | 66.9735  | 1892 | 0.8593          | 0.8075   |
| 0.0783        | 68.0     | 1921 | 0.8423          | 0.815    |
| 0.0851        | 68.9912  | 1949 | 0.8265          | 0.82     |
| 0.0775        | 69.9823  | 1977 | 0.8708          | 0.805    |
| 0.0902        | 70.9735  | 2005 | 0.8181          | 0.81     |
| 0.0904        | 72.0     | 2034 | 0.8297          | 0.82     |
| 0.0898        | 72.9912  | 2062 | 0.8464          | 0.82     |
| 0.1013        | 73.9823  | 2090 | 0.8325          | 0.81     |
| 0.0726        | 74.9735  | 2118 | 0.8772          | 0.8      |
| 0.0745        | 76.0     | 2147 | 0.8505          | 0.8125   |
| 0.0891        | 76.9912  | 2175 | 0.8694          | 0.81     |
| 0.0791        | 77.9823  | 2203 | 0.8766          | 0.81     |
| 0.0639        | 78.9735  | 2231 | 0.8462          | 0.8125   |
| 0.0676        | 80.0     | 2260 | 0.8991          | 0.8075   |
| 0.0904        | 80.9912  | 2288 | 0.8551          | 0.815    |
| 0.0788        | 81.9823  | 2316 | 0.9302          | 0.795    |
| 0.0787        | 82.9735  | 2344 | 0.8706          | 0.8025   |
| 0.0918        | 84.0     | 2373 | 0.8680          | 0.805    |
| 0.0681        | 84.9912  | 2401 | 0.8481          | 0.8125   |
| 0.115         | 85.9823  | 2429 | 0.8553          | 0.8025   |
| 0.0599        | 86.9735  | 2457 | 0.8887          | 0.805    |
| 0.0774        | 88.0     | 2486 | 0.9255          | 0.81     |
| 0.0701        | 88.9912  | 2514 | 0.8795          | 0.81     |
| 0.074         | 89.9823  | 2542 | 0.8634          | 0.8175   |
| 0.0497        | 90.9735  | 2570 | 0.8793          | 0.82     |
| 0.0569        | 92.0     | 2599 | 0.9007          | 0.7925   |
| 0.0722        | 92.9912  | 2627 | 0.8701          | 0.815    |
| 0.0674        | 93.9823  | 2655 | 0.8880          | 0.8225   |
| 0.0643        | 94.9735  | 2683 | 0.8855          | 0.8075   |
| 0.0583        | 96.0     | 2712 | 0.8918          | 0.815    |
| 0.0558        | 96.9912  | 2740 | 0.8736          | 0.8275   |
| 0.0622        | 97.9823  | 2768 | 0.9058          | 0.815    |
| 0.0689        | 98.9735  | 2796 | 0.9007          | 0.8075   |
| 0.0782        | 100.0    | 2825 | 0.9216          | 0.8025   |
| 0.0696        | 100.9912 | 2853 | 0.9159          | 0.8075   |
| 0.0554        | 101.9823 | 2881 | 0.9195          | 0.8125   |
| 0.0585        | 102.9735 | 2909 | 0.9314          | 0.8125   |
| 0.0541        | 104.0    | 2938 | 0.8939          | 0.825    |
| 0.0636        | 104.9912 | 2966 | 0.9045          | 0.8025   |
| 0.0684        | 105.9823 | 2994 | 0.8892          | 0.8075   |
| 0.0608        | 106.9735 | 3022 | 0.8999          | 0.8075   |
| 0.0663        | 108.0    | 3051 | 0.9033          | 0.8075   |
| 0.054         | 108.9912 | 3079 | 0.9249          | 0.805    |
| 0.0538        | 109.9823 | 3107 | 0.9065          | 0.81     |
| 0.0696        | 110.9735 | 3135 | 0.9002          | 0.8175   |
| 0.0585        | 112.0    | 3164 | 0.9106          | 0.8025   |
| 0.0641        | 112.9912 | 3192 | 0.9088          | 0.81     |
| 0.0611        | 113.9823 | 3220 | 0.9152          | 0.8075   |
| 0.0528        | 114.9735 | 3248 | 0.9140          | 0.8125   |
| 0.0631        | 116.0    | 3277 | 0.9184          | 0.81     |
| 0.0744        | 116.9912 | 3305 | 0.9216          | 0.8125   |
| 0.0407        | 117.9823 | 3333 | 0.9211          | 0.8125   |
| 0.0573        | 118.9381 | 3360 | 0.9202          | 0.81     |

### Framework versions

- Transformers 4.44.0
- Pytorch 2.4.0
- Datasets 2.21.0
- Tokenizers 0.19.1