hushem_5x_deit_tiny_sgd_0001_fold1

This model is a fine-tuned version of facebook/deit-tiny-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set (a usage sketch follows the results):

  • Loss: 1.5185
  • Accuracy: 0.2667
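
The checkpoint can be loaded with the transformers image-classification pipeline. A minimal sketch, assuming the hub repository id hkivancoral/hushem_5x_deit_tiny_sgd_0001_fold1 and a placeholder local image path:

```python
from transformers import pipeline

# Repository id taken from this model card; adjust if the checkpoint lives elsewhere.
classifier = pipeline(
    "image-classification",
    model="hkivancoral/hushem_5x_deit_tiny_sgd_0001_fold1",
)

# "example.jpg" is a hypothetical path to a local input image.
predictions = classifier("example.jpg")
print(predictions)
```

The pipeline returns a list of label/score dictionaries sorted by score.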

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
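
A sketch of the corresponding TrainingArguments, assuming the standard Hugging Face Trainer setup; output_dir and evaluation_strategy are illustrative additions not listed above:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hushem_5x_deit_tiny_sgd_0001_fold1",  # placeholder output directory
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,                 # optimizer settings listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",    # assumption: per-epoch evaluation, matching the results table
)
```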

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.5559        | 1.0   | 27   | 1.6659          | 0.2667   |
| 1.5307        | 2.0   | 54   | 1.6510          | 0.2667   |
| 1.5463        | 3.0   | 81   | 1.6380          | 0.2889   |
| 1.5241        | 4.0   | 108  | 1.6272          | 0.2889   |
| 1.4794        | 5.0   | 135  | 1.6169          | 0.2889   |
| 1.5071        | 6.0   | 162  | 1.6070          | 0.2889   |
| 1.4768        | 7.0   | 189  | 1.5986          | 0.2889   |
| 1.4869        | 8.0   | 216  | 1.5910          | 0.2889   |
| 1.4651        | 9.0   | 243  | 1.5844          | 0.3111   |
| 1.4396        | 10.0  | 270  | 1.5781          | 0.3111   |
| 1.4572        | 11.0  | 297  | 1.5728          | 0.3111   |
| 1.4029        | 12.0  | 324  | 1.5680          | 0.3111   |
| 1.4355        | 13.0  | 351  | 1.5638          | 0.3111   |
| 1.4582        | 14.0  | 378  | 1.5597          | 0.2889   |
| 1.4073        | 15.0  | 405  | 1.5561          | 0.2889   |
| 1.4381        | 16.0  | 432  | 1.5526          | 0.2889   |
| 1.4333        | 17.0  | 459  | 1.5495          | 0.2889   |
| 1.3978        | 18.0  | 486  | 1.5468          | 0.2889   |
| 1.3884        | 19.0  | 513  | 1.5441          | 0.2889   |
| 1.3796        | 20.0  | 540  | 1.5418          | 0.2889   |
| 1.4025        | 21.0  | 567  | 1.5397          | 0.2889   |
| 1.3822        | 22.0  | 594  | 1.5376          | 0.2889   |
| 1.3868        | 23.0  | 621  | 1.5359          | 0.2889   |
| 1.3907        | 24.0  | 648  | 1.5343          | 0.2889   |
| 1.38          | 25.0  | 675  | 1.5327          | 0.2667   |
| 1.3755        | 26.0  | 702  | 1.5313          | 0.2667   |
| 1.3485        | 27.0  | 729  | 1.5299          | 0.2667   |
| 1.3648        | 28.0  | 756  | 1.5287          | 0.2667   |
| 1.3797        | 29.0  | 783  | 1.5276          | 0.2667   |
| 1.3716        | 30.0  | 810  | 1.5265          | 0.2667   |
| 1.389         | 31.0  | 837  | 1.5256          | 0.2667   |
| 1.3813        | 32.0  | 864  | 1.5247          | 0.2667   |
| 1.3289        | 33.0  | 891  | 1.5240          | 0.2667   |
| 1.3517        | 34.0  | 918  | 1.5232          | 0.2667   |
| 1.3834        | 35.0  | 945  | 1.5225          | 0.2667   |
| 1.3458        | 36.0  | 972  | 1.5218          | 0.2667   |
| 1.3745        | 37.0  | 999  | 1.5212          | 0.2667   |
| 1.3761        | 38.0  | 1026 | 1.5207          | 0.2667   |
| 1.3726        | 39.0  | 1053 | 1.5203          | 0.2667   |
| 1.3125        | 40.0  | 1080 | 1.5199          | 0.2667   |
| 1.3599        | 41.0  | 1107 | 1.5196          | 0.2667   |
| 1.3277        | 42.0  | 1134 | 1.5193          | 0.2667   |
| 1.3748        | 43.0  | 1161 | 1.5191          | 0.2667   |
| 1.3689        | 44.0  | 1188 | 1.5188          | 0.2667   |
| 1.3379        | 45.0  | 1215 | 1.5187          | 0.2667   |
| 1.3358        | 46.0  | 1242 | 1.5186          | 0.2667   |
| 1.3497        | 47.0  | 1269 | 1.5185          | 0.2667   |
| 1.3482        | 48.0  | 1296 | 1.5185          | 0.2667   |
| 1.3616        | 49.0  | 1323 | 1.5185          | 0.2667   |
| 1.3216        | 50.0  | 1350 | 1.5185          | 0.2667   |

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0