# smids_1x_deit_tiny_rms_001_fold2
This model is a fine-tuned version of facebook/deit-tiny-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:
- Loss: 1.2712
- Accuracy: 0.7554
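
As a quick usage illustration, the snippet below is a minimal inference sketch using the `transformers` image-classification pipeline. It assumes the checkpoint is available on the Hub under `hkivancoral/smids_1x_deit_tiny_rms_001_fold2` and that a local image file `example.png` exists; both are placeholders rather than details confirmed by this card.

```python
# Minimal inference sketch (assumes the checkpoint is published on the Hub
# under hkivancoral/smids_1x_deit_tiny_rms_001_fold2 and that example.png exists).
from transformers import pipeline

classifier = pipeline(
    task="image-classification",
    model="hkivancoral/smids_1x_deit_tiny_rms_001_fold2",
)

# The pipeline applies the model's image processor (resizing, normalization)
# before running the classifier head.
predictions = classifier("example.png")
print(predictions)  # e.g. [{'label': '...', 'score': 0.97}, ...]
```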
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 0.001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
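
The listed settings map roughly onto the `TrainingArguments` sketch below. This is a hedged reconstruction for illustration, not the original training script: the output directory name and the per-epoch evaluation strategy are assumptions.

```python
# Hedged reconstruction of the reported hyperparameters with the Trainer API.
# output_dir is an assumed placeholder; the actual script may have differed.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="smids_1x_deit_tiny_rms_001_fold2",  # assumed name
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumed: results below are reported per epoch
)
```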
### Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy |
---|---|---|---|---|
1.1905 | 1.0 | 75 | 1.0709 | 0.4526 |
1.1189 | 2.0 | 150 | 0.9655 | 0.5241 |
0.9756 | 3.0 | 225 | 0.9009 | 0.4626 |
0.9398 | 4.0 | 300 | 0.8912 | 0.5757 |
0.9083 | 5.0 | 375 | 0.8740 | 0.5191 |
0.9033 | 6.0 | 450 | 0.8827 | 0.5092 |
0.8824 | 7.0 | 525 | 0.7939 | 0.5824 |
0.9323 | 8.0 | 600 | 0.8247 | 0.5524 |
0.8419 | 9.0 | 675 | 0.8138 | 0.5807 |
0.8164 | 10.0 | 750 | 0.8013 | 0.5674 |
0.859 | 11.0 | 825 | 0.8565 | 0.5458 |
0.7125 | 12.0 | 900 | 0.8056 | 0.6173 |
0.803 | 13.0 | 975 | 0.8062 | 0.6190 |
0.7687 | 14.0 | 1050 | 0.7699 | 0.6040 |
0.7356 | 15.0 | 1125 | 0.7299 | 0.6589 |
0.7325 | 16.0 | 1200 | 0.7012 | 0.6722 |
0.6848 | 17.0 | 1275 | 0.7155 | 0.6423 |
0.6885 | 18.0 | 1350 | 0.6752 | 0.6689 |
0.6698 | 19.0 | 1425 | 0.6669 | 0.6872 |
0.7037 | 20.0 | 1500 | 0.6890 | 0.6789 |
0.6983 | 21.0 | 1575 | 0.7382 | 0.6423 |
0.6127 | 22.0 | 1650 | 0.6732 | 0.7038 |
0.6242 | 23.0 | 1725 | 0.6106 | 0.7304 |
0.658 | 24.0 | 1800 | 0.6268 | 0.7121 |
0.5546 | 25.0 | 1875 | 0.6631 | 0.7088 |
0.5765 | 26.0 | 1950 | 0.6682 | 0.6988 |
0.6162 | 27.0 | 2025 | 0.6203 | 0.7304 |
0.5296 | 28.0 | 2100 | 0.6174 | 0.7438 |
0.5276 | 29.0 | 2175 | 0.5823 | 0.7371 |
0.4954 | 30.0 | 2250 | 0.7129 | 0.6922 |
0.5509 | 31.0 | 2325 | 0.6075 | 0.7404 |
0.4629 | 32.0 | 2400 | 0.6387 | 0.7488 |
0.4323 | 33.0 | 2475 | 0.6167 | 0.7421 |
0.4094 | 34.0 | 2550 | 0.6489 | 0.7637 |
0.419 | 35.0 | 2625 | 0.6362 | 0.7371 |
0.444 | 36.0 | 2700 | 0.6255 | 0.7621 |
0.4294 | 37.0 | 2775 | 0.6272 | 0.7604 |
0.3866 | 38.0 | 2850 | 0.6218 | 0.7770 |
0.3776 | 39.0 | 2925 | 0.6660 | 0.7637 |
0.3382 | 40.0 | 3000 | 0.7027 | 0.7720 |
0.3406 | 41.0 | 3075 | 0.7627 | 0.7770 |
0.3115 | 42.0 | 3150 | 0.7813 | 0.7737 |
0.2146 | 43.0 | 3225 | 0.8652 | 0.7521 |
0.2529 | 44.0 | 3300 | 0.9528 | 0.7504 |
0.1582 | 45.0 | 3375 | 0.9733 | 0.7704 |
0.1575 | 46.0 | 3450 | 1.0460 | 0.7704 |
0.1254 | 47.0 | 3525 | 1.1356 | 0.7737 |
0.076 | 48.0 | 3600 | 1.2020 | 0.7604 |
0.0433 | 49.0 | 3675 | 1.2652 | 0.7571 |
0.0823 | 50.0 | 3750 | 1.2712 | 0.7554 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0