# smids_10x_deit_tiny_sgd_0001_fold4

This model is a fine-tuned version of [facebook/deit-tiny-patch16-224](https://huggingface.co/facebook/deit-tiny-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set:

- Loss: 0.4161
- Accuracy: 0.8417
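The checkpoint can be tried out with the standard `transformers` image-classification pipeline. This is a minimal sketch, not part of the original card; the repo id is taken from the model title, and the image path is a placeholder.

```python
# Repo id assumed from this card's title.
MODEL_ID = "hkivancoral/smids_10x_deit_tiny_sgd_0001_fold4"

def load_classifier():
    """Build an image-classification pipeline for this checkpoint.

    Downloads the weights from the Hugging Face Hub on first call,
    so network access is required.
    """
    from transformers import pipeline
    return pipeline("image-classification", model=MODEL_ID)

# Usage (hypothetical image path):
# classifier = load_classifier()
# preds = classifier("path/to/image.png")  # list of {"label", "score"} dicts
```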

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
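For reference, the list above can be collected into the keyword arguments one would pass to `transformers.TrainingArguments`. This is a sketch, not the original training script; the field-name mapping (e.g. `warmup_ratio`, `per_device_train_batch_size`) is an assumption based on how the Trainer typically reports these values.

```python
# Sketch: the reported hyperparameters as a plain dict, keyed by the
# TrainingArguments field names they most likely map to (assumed mapping).
training_kwargs = {
    "learning_rate": 1e-4,
    "per_device_train_batch_size": 32,
    "per_device_eval_batch_size": 32,
    "seed": 42,
    "lr_scheduler_type": "linear",
    "warmup_ratio": 0.1,
    "num_train_epochs": 50,
}

# Usage (hypothetical output dir):
# from transformers import TrainingArguments
# args = TrainingArguments(output_dir="out", **training_kwargs)
```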

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 1.0688        | 1.0   | 750   | 1.0895          | 0.3983   |
| 0.9867        | 2.0   | 1500  | 0.9975          | 0.4833   |
| 0.9278        | 3.0   | 2250  | 0.9175          | 0.54     |
| 0.8528        | 4.0   | 3000  | 0.8474          | 0.5783   |
| 0.714         | 5.0   | 3750  | 0.7808          | 0.635    |
| 0.6866        | 6.0   | 4500  | 0.7206          | 0.6917   |
| 0.6101        | 7.0   | 5250  | 0.6718          | 0.7167   |
| 0.6452        | 8.0   | 6000  | 0.6299          | 0.7333   |
| 0.6113        | 9.0   | 6750  | 0.5989          | 0.7467   |
| 0.5055        | 10.0  | 7500  | 0.5743          | 0.7633   |
| 0.4983        | 11.0  | 8250  | 0.5553          | 0.7717   |
| 0.5538        | 12.0  | 9000  | 0.5377          | 0.7733   |
| 0.4959        | 13.0  | 9750  | 0.5236          | 0.7833   |
| 0.4737        | 14.0  | 10500 | 0.5129          | 0.7867   |
| 0.4376        | 15.0  | 11250 | 0.5024          | 0.7967   |
| 0.3926        | 16.0  | 12000 | 0.4941          | 0.8033   |
| 0.397         | 17.0  | 12750 | 0.4866          | 0.805    |
| 0.4304        | 18.0  | 13500 | 0.4793          | 0.8067   |
| 0.4526        | 19.0  | 14250 | 0.4737          | 0.81     |
| 0.4267        | 20.0  | 15000 | 0.4680          | 0.81     |
| 0.3746        | 21.0  | 15750 | 0.4626          | 0.8183   |
| 0.4237        | 22.0  | 16500 | 0.4581          | 0.815    |
| 0.4022        | 23.0  | 17250 | 0.4540          | 0.82     |
| 0.465         | 24.0  | 18000 | 0.4503          | 0.8233   |
| 0.3585        | 25.0  | 18750 | 0.4464          | 0.8267   |
| 0.3671        | 26.0  | 19500 | 0.4431          | 0.8267   |
| 0.3889        | 27.0  | 20250 | 0.4400          | 0.8283   |
| 0.3836        | 28.0  | 21000 | 0.4372          | 0.83     |
| 0.3751        | 29.0  | 21750 | 0.4351          | 0.83     |
| 0.3772        | 30.0  | 22500 | 0.4334          | 0.8333   |
| 0.3959        | 31.0  | 23250 | 0.4312          | 0.8333   |
| 0.3701        | 32.0  | 24000 | 0.4290          | 0.8317   |
| 0.3441        | 33.0  | 24750 | 0.4274          | 0.8317   |
| 0.371         | 34.0  | 25500 | 0.4262          | 0.8333   |
| 0.327         | 35.0  | 26250 | 0.4246          | 0.8333   |
| 0.3799        | 36.0  | 27000 | 0.4233          | 0.8367   |
| 0.3186        | 37.0  | 27750 | 0.4226          | 0.835    |
| 0.3955        | 38.0  | 28500 | 0.4215          | 0.835    |
| 0.4171        | 39.0  | 29250 | 0.4206          | 0.835    |
| 0.4116        | 40.0  | 30000 | 0.4196          | 0.8367   |
| 0.369         | 41.0  | 30750 | 0.4189          | 0.8383   |
| 0.3461        | 42.0  | 31500 | 0.4184          | 0.8383   |
| 0.3837        | 43.0  | 32250 | 0.4178          | 0.8417   |
| 0.3565        | 44.0  | 33000 | 0.4174          | 0.8417   |
| 0.3745        | 45.0  | 33750 | 0.4170          | 0.8417   |
| 0.3413        | 46.0  | 34500 | 0.4167          | 0.8417   |
| 0.301         | 47.0  | 35250 | 0.4164          | 0.8417   |
| 0.3105        | 48.0  | 36000 | 0.4162          | 0.8417   |
| 0.3511        | 49.0  | 36750 | 0.4161          | 0.8417   |
| 0.3221        | 50.0  | 37500 | 0.4161          | 0.8417   |

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.0+cu121
- Datasets 2.12.0
- Tokenizers 0.13.2