---
license: apache-2.0
base_model: facebook/convnextv2-tiny-1k-224
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
  - precision
  - recall
model-index:
  - name: convnextv2-tiny-1k-224-finetuned-neck-style
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: train
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.8492753623188406
          - name: Precision
            type: precision
            value: 0.8507158478342087
          - name: Recall
            type: recall
            value: 0.8492753623188406
---

# convnextv2-tiny-1k-224-finetuned-neck-style

This model is a fine-tuned version of [facebook/convnextv2-tiny-1k-224](https://huggingface.co/facebook/convnextv2-tiny-1k-224) on the imagefolder dataset. It achieves the following results on the evaluation set:

- Loss: 0.6084
- Accuracy: 0.8493
- Precision: 0.8507
- Recall: 0.8493

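For quick inference, the checkpoint can be loaded with the standard `transformers` image-classification pipeline. A minimal sketch, assuming the model is published under the repo id `vishalkatheriya18/convnextv2-tiny-1k-224-finetuned-neck-style` (the image file name is hypothetical):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint; the repo id is assumed from this card's title.
classifier = pipeline(
    "image-classification",
    model="vishalkatheriya18/convnextv2-tiny-1k-224-finetuned-neck-style",
)

# Score a local garment photo (hypothetical file name).
for pred in classifier("neckline_example.jpg"):
    print(f"{pred['label']}: {pred['score']:.4f}")
```
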
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
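
The card only records the generic `imagefolder` loader, so the underlying images and class labels are undocumented. As a hedged sketch, such a dataset is usually a directory with one sub-folder per class; the `data_dir` path and split fraction below are assumptions:

```python
from datasets import load_dataset

# Hypothetical layout: data/<class-name>/<image files>, one folder per neck style.
dataset = load_dataset("imagefolder", data_dir="data")

# The metadata reports metrics on the "train" split, so an evaluation split
# would have to be carved out separately, for example:
splits = dataset["train"].train_test_split(test_size=0.15, seed=42)
print(splits["train"].features["label"].names)  # discovered class names
```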

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):

- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
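
A minimal sketch of how these values map onto `transformers.TrainingArguments`; the `output_dir` and the per-epoch evaluation setting are assumptions, and the optimizer values are the library defaults:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="convnextv2-tiny-1k-224-finetuned-neck-style",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # 32 * 4 = 128 effective train batch size
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    eval_strategy="epoch",  # assumed; the results below are logged per epoch
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the default optimizer setup.
```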

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy | Precision | Recall |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:---------:|:------:|
| 1.613 | 0.9897 | 24 | 1.5833 | 0.2928 | 0.3248 | 0.2928 |
| 1.5494 | 1.9794 | 48 | 1.4944 | 0.3681 | 0.4410 | 0.3681 |
| 1.3989 | 2.9691 | 72 | 1.3424 | 0.5159 | 0.5262 | 0.5159 |
| 1.2238 | 4.0 | 97 | 1.1162 | 0.6261 | 0.6666 | 0.6261 |
| 0.9585 | 4.9897 | 121 | 0.8966 | 0.6986 | 0.7014 | 0.6986 |
| 0.8934 | 5.9794 | 145 | 0.7638 | 0.7507 | 0.7490 | 0.7507 |
| 0.7589 | 6.9691 | 169 | 0.6776 | 0.7652 | 0.7719 | 0.7652 |
| 0.6746 | 8.0 | 194 | 0.6127 | 0.7623 | 0.7628 | 0.7623 |
| 0.6048 | 8.9897 | 218 | 0.5221 | 0.8203 | 0.8217 | 0.8203 |
| 0.531 | 9.9794 | 242 | 0.4931 | 0.8116 | 0.8204 | 0.8116 |
| 0.57 | 10.9691 | 266 | 0.4480 | 0.8319 | 0.8345 | 0.8319 |
| 0.4624 | 12.0 | 291 | 0.4214 | 0.8464 | 0.8460 | 0.8464 |
| 0.417 | 12.9897 | 315 | 0.4439 | 0.8493 | 0.8486 | 0.8493 |
| 0.3814 | 13.9794 | 339 | 0.4138 | 0.8464 | 0.8478 | 0.8464 |
| 0.3737 | 14.9691 | 363 | 0.4139 | 0.8464 | 0.8466 | 0.8464 |
| 0.3971 | 16.0 | 388 | 0.4119 | 0.8638 | 0.8665 | 0.8638 |
| 0.343 | 16.9897 | 412 | 0.4421 | 0.8609 | 0.8659 | 0.8609 |
| 0.3311 | 17.9794 | 436 | 0.4581 | 0.8493 | 0.8504 | 0.8493 |
| 0.2652 | 18.9691 | 460 | 0.4563 | 0.8406 | 0.8441 | 0.8406 |
| 0.3026 | 20.0 | 485 | 0.4536 | 0.8522 | 0.8549 | 0.8522 |
| 0.2562 | 20.9897 | 509 | 0.4409 | 0.8464 | 0.8493 | 0.8464 |
| 0.2282 | 21.9794 | 533 | 0.4389 | 0.8435 | 0.8451 | 0.8435 |
| 0.2374 | 22.9691 | 557 | 0.4452 | 0.8580 | 0.8589 | 0.8580 |
| 0.216 | 24.0 | 582 | 0.4375 | 0.8580 | 0.8581 | 0.8580 |
| 0.2127 | 24.9897 | 606 | 0.4422 | 0.8580 | 0.8588 | 0.8580 |
| 0.2004 | 25.9794 | 630 | 0.4635 | 0.8522 | 0.8519 | 0.8522 |
| 0.2029 | 26.9691 | 654 | 0.5215 | 0.8493 | 0.8546 | 0.8493 |
| 0.1794 | 28.0 | 679 | 0.4756 | 0.8638 | 0.8669 | 0.8638 |
| 0.1835 | 28.9897 | 703 | 0.4728 | 0.8609 | 0.8650 | 0.8609 |
| 0.1781 | 29.9794 | 727 | 0.4637 | 0.8551 | 0.8568 | 0.8551 |
| 0.1671 | 30.9691 | 751 | 0.4856 | 0.8580 | 0.8599 | 0.8580 |
| 0.1762 | 32.0 | 776 | 0.5008 | 0.8667 | 0.8684 | 0.8667 |
| 0.1867 | 32.9897 | 800 | 0.5058 | 0.8580 | 0.8585 | 0.8580 |
| 0.1409 | 33.9794 | 824 | 0.5490 | 0.8406 | 0.8409 | 0.8406 |
| 0.1315 | 34.9691 | 848 | 0.5284 | 0.8348 | 0.8356 | 0.8348 |
| 0.1315 | 36.0 | 873 | 0.5415 | 0.8464 | 0.8488 | 0.8464 |
| 0.1974 | 36.9897 | 897 | 0.5194 | 0.8493 | 0.8536 | 0.8493 |
| 0.1337 | 37.9794 | 921 | 0.5088 | 0.8609 | 0.8603 | 0.8609 |
| 0.173 | 38.9691 | 945 | 0.4912 | 0.8667 | 0.8680 | 0.8667 |
| 0.1409 | 40.0 | 970 | 0.5223 | 0.8493 | 0.8502 | 0.8493 |
| 0.1379 | 40.9897 | 994 | 0.5204 | 0.8493 | 0.8487 | 0.8493 |
| 0.1437 | 41.9794 | 1018 | 0.5860 | 0.8522 | 0.8551 | 0.8522 |
| 0.1022 | 42.9691 | 1042 | 0.5461 | 0.8464 | 0.8492 | 0.8464 |
| 0.1181 | 44.0 | 1067 | 0.5411 | 0.8551 | 0.8566 | 0.8551 |
| 0.1212 | 44.9897 | 1091 | 0.5294 | 0.8580 | 0.8580 | 0.8580 |
| 0.1049 | 45.9794 | 1115 | 0.5667 | 0.8493 | 0.8492 | 0.8493 |
| 0.1132 | 46.9691 | 1139 | 0.5908 | 0.8464 | 0.8491 | 0.8464 |
| 0.1313 | 48.0 | 1164 | 0.5996 | 0.8522 | 0.8582 | 0.8522 |
| 0.1312 | 48.9897 | 1188 | 0.5430 | 0.8580 | 0.8607 | 0.8580 |
| 0.0996 | 49.9794 | 1212 | 0.5777 | 0.8522 | 0.8561 | 0.8522 |
| 0.1389 | 50.9691 | 1236 | 0.5758 | 0.8435 | 0.8486 | 0.8435 |
| 0.1079 | 52.0 | 1261 | 0.5540 | 0.8580 | 0.8611 | 0.8580 |
| 0.0972 | 52.9897 | 1285 | 0.5600 | 0.8551 | 0.8559 | 0.8551 |
| 0.0985 | 53.9794 | 1309 | 0.5392 | 0.8638 | 0.8656 | 0.8638 |
| 0.1112 | 54.9691 | 1333 | 0.5411 | 0.8638 | 0.8656 | 0.8638 |
| 0.1308 | 56.0 | 1358 | 0.5445 | 0.8638 | 0.8654 | 0.8638 |
| 0.1005 | 56.9897 | 1382 | 0.5554 | 0.8551 | 0.8551 | 0.8551 |
| 0.0871 | 57.9794 | 1406 | 0.5966 | 0.8406 | 0.8441 | 0.8406 |
| 0.1102 | 58.9691 | 1430 | 0.5807 | 0.8522 | 0.8543 | 0.8522 |
| 0.1028 | 60.0 | 1455 | 0.5654 | 0.8435 | 0.8491 | 0.8435 |
| 0.107 | 60.9897 | 1479 | 0.5779 | 0.8435 | 0.8461 | 0.8435 |
| 0.0848 | 61.9794 | 1503 | 0.5843 | 0.8551 | 0.8569 | 0.8551 |
| 0.0976 | 62.9691 | 1527 | 0.6162 | 0.8435 | 0.8454 | 0.8435 |
| 0.0977 | 64.0 | 1552 | 0.5822 | 0.8464 | 0.8469 | 0.8464 |
| 0.1256 | 64.9897 | 1576 | 0.5757 | 0.8493 | 0.8514 | 0.8493 |
| 0.0883 | 65.9794 | 1600 | 0.5716 | 0.8464 | 0.8467 | 0.8464 |
| 0.0808 | 66.9691 | 1624 | 0.5726 | 0.8551 | 0.8562 | 0.8551 |
| 0.1034 | 68.0 | 1649 | 0.5413 | 0.8551 | 0.8549 | 0.8551 |
| 0.0845 | 68.9897 | 1673 | 0.5826 | 0.8435 | 0.8477 | 0.8435 |
| 0.0916 | 69.9794 | 1697 | 0.5661 | 0.8522 | 0.8522 | 0.8522 |
| 0.0912 | 70.9691 | 1721 | 0.5771 | 0.8493 | 0.8498 | 0.8493 |
| 0.0863 | 72.0 | 1746 | 0.5769 | 0.8551 | 0.8550 | 0.8551 |
| 0.083 | 72.9897 | 1770 | 0.5860 | 0.8493 | 0.8486 | 0.8493 |
| 0.0839 | 73.9794 | 1794 | 0.5647 | 0.8551 | 0.8551 | 0.8551 |
| 0.0903 | 74.9691 | 1818 | 0.6012 | 0.8551 | 0.8535 | 0.8551 |
| 0.074 | 76.0 | 1843 | 0.6048 | 0.8464 | 0.8461 | 0.8464 |
| 0.0907 | 76.9897 | 1867 | 0.5807 | 0.8493 | 0.8495 | 0.8493 |
| 0.0613 | 77.9794 | 1891 | 0.5775 | 0.8377 | 0.8382 | 0.8377 |
| 0.0964 | 78.9691 | 1915 | 0.5759 | 0.8667 | 0.8676 | 0.8667 |
| 0.0735 | 80.0 | 1940 | 0.5962 | 0.8551 | 0.8566 | 0.8551 |
| 0.0663 | 80.9897 | 1964 | 0.5769 | 0.8435 | 0.8441 | 0.8435 |
| 0.0719 | 81.9794 | 1988 | 0.5826 | 0.8493 | 0.8507 | 0.8493 |
| 0.0718 | 82.9691 | 2012 | 0.5880 | 0.8580 | 0.8590 | 0.8580 |
| 0.0925 | 84.0 | 2037 | 0.5986 | 0.8493 | 0.8513 | 0.8493 |
| 0.0621 | 84.9897 | 2061 | 0.5915 | 0.8493 | 0.8497 | 0.8493 |
| 0.059 | 85.9794 | 2085 | 0.5779 | 0.8580 | 0.8577 | 0.8580 |
| 0.0806 | 86.9691 | 2109 | 0.5928 | 0.8493 | 0.8501 | 0.8493 |
| 0.0617 | 88.0 | 2134 | 0.6062 | 0.8522 | 0.8520 | 0.8522 |
| 0.0651 | 88.9897 | 2158 | 0.6067 | 0.8522 | 0.8519 | 0.8522 |
| 0.0754 | 89.9794 | 2182 | 0.6108 | 0.8551 | 0.8553 | 0.8551 |
| 0.0682 | 90.9691 | 2206 | 0.6185 | 0.8493 | 0.8489 | 0.8493 |
| 0.0763 | 92.0 | 2231 | 0.6168 | 0.8580 | 0.8575 | 0.8580 |
| 0.0703 | 92.9897 | 2255 | 0.6259 | 0.8522 | 0.8521 | 0.8522 |
| 0.0861 | 93.9794 | 2279 | 0.6128 | 0.8551 | 0.8553 | 0.8551 |
| 0.0807 | 94.9691 | 2303 | 0.6140 | 0.8551 | 0.8547 | 0.8551 |
| 0.0621 | 96.0 | 2328 | 0.6133 | 0.8522 | 0.8532 | 0.8522 |
| 0.0831 | 96.9897 | 2352 | 0.6101 | 0.8493 | 0.8507 | 0.8493 |
| 0.0625 | 97.9794 | 2376 | 0.6097 | 0.8493 | 0.8507 | 0.8493 |
| 0.0571 | 98.9691 | 2400 | 0.6084 | 0.8493 | 0.8507 | 0.8493 |
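
Accuracy and recall coincide in every row above, which is what weighted-average recall reduces to in multiclass classification. Below is a hedged sketch of a `compute_metrics` hook that would produce such a log; the weighted averaging is an inference from the numbers, not stated on the card:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_score, recall_score

def compute_metrics(eval_pred):
    """Metrics hook for transformers.Trainer (averaging choice assumed)."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, preds),
        # With average="weighted", recall equals accuracy, matching the table.
        "precision": precision_score(labels, preds, average="weighted"),
        "recall": recall_score(labels, preds, average="weighted"),
    }
```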

### Framework versions

- Transformers 4.44.0
- PyTorch 2.4.0
- Datasets 2.21.0
- Tokenizers 0.19.1