# Swin-dmae-DA5-N-Colab
This model is a fine-tuned version of [microsoft/swinv2-tiny-patch4-window8-256](https://huggingface.co/microsoft/swinv2-tiny-patch4-window8-256) on the imagefolder dataset. It achieves the following results on the evaluation set:
- Loss: 1.4370
- Accuracy: 0.7812
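For shape checks without downloading the fine-tuned weights, the same Swin V2-tiny architecture can be instantiated directly from a config. Note that `num_labels=8` below is an assumption (the chance-level 0.125 accuracy early in the training log suggests eight balanced classes); substitute your dataset's class count.

```python
import torch
from transformers import Swinv2Config, Swinv2ForImageClassification

# Swin V2-tiny geometry matching microsoft/swinv2-tiny-patch4-window8-256.
# num_labels=8 is an assumption (the log's initial 0.125 accuracy suggests
# eight balanced classes); replace with your dataset's class count.
config = Swinv2Config(
    image_size=256,
    patch_size=4,
    window_size=8,
    embed_dim=96,
    depths=[2, 2, 6, 2],
    num_heads=[3, 6, 12, 24],
    num_labels=8,
)
model = Swinv2ForImageClassification(config)  # random weights, for inspection only

# A dummy preprocessed batch: one 3-channel 256x256 image.
pixel_values = torch.randn(1, 3, 256, 256)
with torch.no_grad():
    logits = model(pixel_values).logits
print(logits.shape)  # torch.Size([1, 8])
```

For real inference, load the fine-tuned checkpoint with `Swinv2ForImageClassification.from_pretrained(...)` and preprocess images with the matching `AutoImageProcessor` instead.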
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 4e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 120
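These settings give an effective batch size of 16 × 4 = 64, with a linear warmup over the first 10% of optimizer steps followed by linear decay. A torch-only sketch of that schedule (the 2880-step horizon is taken from the final row of the training log; the dummy parameter exists only to build an optimizer):

```python
import torch

# Effective batch size: 16 per device x 4 gradient-accumulation steps = 64.
base_lr = 4e-5
total_steps = 2880                      # final optimizer step in the training log
warmup_steps = int(0.1 * total_steps)   # lr_scheduler_warmup_ratio: 0.1 -> 288

param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.Adam([param], lr=base_lr, betas=(0.9, 0.999), eps=1e-8)

def lr_lambda(step):
    # Linear warmup to base_lr, then linear decay to zero.
    if step < warmup_steps:
        return step / max(1, warmup_steps)
    return max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda)
```

This mirrors what the `Trainer` builds internally from `lr_scheduler_type: linear` and the warmup ratio (equivalent to transformers' `get_linear_schedule_with_warmup`).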
### Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy |
---|---|---|---|---|
6.8626 | 0.97 | 24 | 7.8026 | 0.125 |
6.6392 | 1.98 | 49 | 7.5823 | 0.125 |
6.3514 | 2.99 | 74 | 6.6653 | 0.125 |
5.3109 | 4.0 | 99 | 5.2798 | 0.125 |
3.7476 | 4.97 | 123 | 3.6186 | 0.125 |
2.7138 | 5.98 | 148 | 1.9377 | 0.125 |
1.4116 | 6.99 | 173 | 1.4680 | 0.125 |
1.3932 | 8.0 | 198 | 1.3819 | 0.5 |
1.2566 | 8.97 | 222 | 1.5912 | 0.1562 |
1.1332 | 9.98 | 247 | 1.3336 | 0.4375 |
0.9511 | 10.99 | 272 | 1.1492 | 0.2812 |
0.8905 | 12.0 | 297 | 1.0571 | 0.5312 |
0.8317 | 12.97 | 321 | 0.8474 | 0.6562 |
0.6611 | 13.98 | 346 | 0.7520 | 0.7188 |
0.5683 | 14.99 | 371 | 0.6656 | 0.75 |
0.569 | 16.0 | 396 | 0.8109 | 0.5312 |
0.4702 | 16.97 | 420 | 0.7036 | 0.625 |
0.4244 | 17.98 | 445 | 0.8169 | 0.6562 |
0.3483 | 18.99 | 470 | 0.7076 | 0.7188 |
0.3853 | 20.0 | 495 | 0.8644 | 0.7188 |
0.3038 | 20.97 | 519 | 0.8653 | 0.7188 |
0.2885 | 21.98 | 544 | 1.0444 | 0.7188 |
0.2014 | 22.99 | 569 | 1.0684 | 0.5938 |
0.2764 | 24.0 | 594 | 1.1422 | 0.6562 |
0.2493 | 24.97 | 618 | 1.1025 | 0.6875 |
0.2754 | 25.98 | 643 | 1.0371 | 0.7188 |
0.1793 | 26.99 | 668 | 1.1624 | 0.6562 |
0.1971 | 28.0 | 693 | 1.3177 | 0.6875 |
0.1881 | 28.97 | 717 | 1.2813 | 0.6875 |
0.167 | 29.98 | 742 | 1.5564 | 0.625 |
0.1872 | 30.99 | 767 | 1.3762 | 0.7188 |
0.1374 | 32.0 | 792 | 1.4407 | 0.625 |
0.1841 | 32.97 | 816 | 1.4038 | 0.6875 |
0.167 | 33.98 | 841 | 1.3769 | 0.6875 |
0.1614 | 34.99 | 866 | 1.5351 | 0.6562 |
0.1835 | 36.0 | 891 | 1.4466 | 0.6875 |
0.1917 | 36.97 | 915 | 1.3493 | 0.75 |
0.1171 | 37.98 | 940 | 1.4756 | 0.75 |
0.163 | 38.99 | 965 | 1.4373 | 0.6875 |
0.1688 | 40.0 | 990 | 1.4082 | 0.75 |
0.1318 | 40.97 | 1014 | 1.5907 | 0.6875 |
0.1107 | 41.98 | 1039 | 1.7462 | 0.6875 |
0.1064 | 42.99 | 1064 | 1.8704 | 0.5625 |
0.1423 | 44.0 | 1089 | 1.7155 | 0.5625 |
0.082 | 44.97 | 1113 | 1.5552 | 0.7188 |
0.1012 | 45.98 | 1138 | 1.4190 | 0.6875 |
0.1001 | 46.99 | 1163 | 1.6801 | 0.7188 |
0.1037 | 48.0 | 1188 | 1.6864 | 0.7188 |
0.1089 | 48.97 | 1212 | 1.5225 | 0.6875 |
0.0835 | 49.98 | 1237 | 1.9798 | 0.6875 |
0.0818 | 50.99 | 1262 | 1.7268 | 0.6562 |
0.1134 | 52.0 | 1287 | 1.5996 | 0.75 |
0.1115 | 52.97 | 1311 | 1.7281 | 0.6562 |
0.0929 | 53.98 | 1336 | 1.6346 | 0.75 |
0.0909 | 54.99 | 1361 | 1.4370 | 0.7812 |
0.1076 | 56.0 | 1386 | 1.5510 | 0.7812 |
0.0948 | 56.97 | 1410 | 1.6383 | 0.75 |
0.0914 | 57.98 | 1435 | 1.6938 | 0.6875 |
0.0598 | 58.99 | 1460 | 1.6291 | 0.75 |
0.0769 | 60.0 | 1485 | 1.6594 | 0.75 |
0.0894 | 60.97 | 1509 | 1.6302 | 0.7812 |
0.0999 | 61.98 | 1534 | 1.6562 | 0.7188 |
0.0759 | 62.99 | 1559 | 1.5989 | 0.75 |
0.102 | 64.0 | 1584 | 1.6602 | 0.7812 |
0.0864 | 64.97 | 1608 | 1.7386 | 0.7812 |
0.0722 | 65.98 | 1633 | 2.0495 | 0.7188 |
0.0956 | 66.99 | 1658 | 1.9749 | 0.6875 |
0.0698 | 68.0 | 1683 | 2.0090 | 0.6562 |
0.0635 | 68.97 | 1707 | 2.1600 | 0.625 |
0.0726 | 69.98 | 1732 | 1.8477 | 0.75 |
0.0905 | 70.99 | 1757 | 1.9970 | 0.7188 |
0.0955 | 72.0 | 1782 | 1.9001 | 0.75 |
0.0614 | 72.97 | 1806 | 1.9347 | 0.6562 |
0.0721 | 73.98 | 1831 | 1.9007 | 0.6875 |
0.0868 | 74.99 | 1856 | 2.0204 | 0.6562 |
0.0817 | 76.0 | 1881 | 1.9807 | 0.7188 |
0.0533 | 76.97 | 1905 | 1.9782 | 0.75 |
0.0682 | 77.98 | 1930 | 1.8320 | 0.75 |
0.078 | 78.99 | 1955 | 1.8351 | 0.7188 |
0.0991 | 80.0 | 1980 | 1.9694 | 0.7188 |
0.0601 | 80.97 | 2004 | 1.8795 | 0.7188 |
0.072 | 81.98 | 2029 | 2.0294 | 0.6562 |
0.0746 | 82.99 | 2054 | 1.8439 | 0.7188 |
0.0547 | 84.0 | 2079 | 1.9321 | 0.7188 |
0.0497 | 84.97 | 2103 | 1.8862 | 0.7812 |
0.0566 | 85.98 | 2128 | 2.0067 | 0.6562 |
0.0353 | 86.99 | 2153 | 2.0957 | 0.7188 |
0.0634 | 88.0 | 2178 | 2.1571 | 0.6562 |
0.0477 | 88.97 | 2202 | 2.0384 | 0.6875 |
0.0513 | 89.98 | 2227 | 1.9146 | 0.75 |
0.0717 | 90.99 | 2252 | 1.8838 | 0.7188 |
0.0644 | 92.0 | 2277 | 1.9186 | 0.6875 |
0.0848 | 92.97 | 2301 | 1.8828 | 0.7188 |
0.0393 | 93.98 | 2326 | 1.9442 | 0.7188 |
0.046 | 94.99 | 2351 | 1.8866 | 0.7188 |
0.0487 | 96.0 | 2376 | 1.9787 | 0.6875 |
0.074 | 96.97 | 2400 | 2.0081 | 0.6875 |
0.0435 | 97.98 | 2425 | 1.8839 | 0.75 |
0.0509 | 98.99 | 2450 | 1.9208 | 0.7188 |
0.0571 | 100.0 | 2475 | 1.9770 | 0.7188 |
0.0327 | 100.97 | 2499 | 1.9700 | 0.7188 |
0.0387 | 101.98 | 2524 | 1.9251 | 0.75 |
0.029 | 102.99 | 2549 | 1.9490 | 0.7188 |
0.0478 | 104.0 | 2574 | 1.9358 | 0.7188 |
0.0587 | 104.97 | 2598 | 1.9197 | 0.75 |
0.0523 | 105.98 | 2623 | 1.9309 | 0.7188 |
0.0581 | 106.99 | 2648 | 1.9829 | 0.7188 |
0.0352 | 108.0 | 2673 | 2.0047 | 0.6875 |
0.0373 | 108.97 | 2697 | 1.9897 | 0.7188 |
0.0258 | 109.98 | 2722 | 1.9384 | 0.7188 |
0.039 | 110.99 | 2747 | 1.9356 | 0.6875 |
0.0333 | 112.0 | 2772 | 1.9805 | 0.7188 |
0.0641 | 112.97 | 2796 | 1.9814 | 0.6875 |
0.0649 | 113.98 | 2821 | 1.9726 | 0.6875 |
0.0241 | 114.99 | 2846 | 1.9737 | 0.6875 |
0.0356 | 116.0 | 2871 | 1.9856 | 0.6875 |
0.0601 | 116.36 | 2880 | 1.9853 | 0.6875 |
### Framework versions

- Transformers 4.36.2
- Pytorch 2.1.2+cu118
- Datasets 2.16.1
- Tokenizers 0.15.0