smids_5x_deit_tiny_sgd_0001_fold2

This model is a fine-tuned version of facebook/deit-tiny-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4980
  • Accuracy: 0.8003
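
The checkpoint can be loaded like any other transformers image classifier. Below is a minimal sketch, assuming the weights are published under the repo id matching this card's title and that example.png is an image from the same domain as the training data.

```python
# Minimal usage sketch (assumption: the weights are available under the
# repo id below, taken from the card title; adjust it to wherever the
# weights actually live).
from transformers import pipeline
from PIL import Image

classifier = pipeline(
    "image-classification",
    model="hkivancoral/smids_5x_deit_tiny_sgd_0001_fold2",
)

image = Image.open("example.png").convert("RGB")  # any image from the target domain
for prediction in classifier(image):
    print(prediction["label"], round(prediction["score"], 4))
```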

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
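
For reference, here is a minimal sketch of how these values map onto transformers.TrainingArguments. The output_dir and the evaluation/save strategies are assumptions for illustration, not taken from the original run; the Adam betas and epsilon listed above are the Trainer defaults.

```python
# Sketch of the listed hyperparameters expressed as TrainingArguments.
# output_dir, evaluation_strategy and save_strategy are assumed, not
# recorded in this card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./smids_5x_deit_tiny_sgd_0001_fold2",  # assumed placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    adam_beta1=0.9,       # matches the Adam betas listed above (Trainer defaults)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # assumed; the card logs one eval per epoch
    save_strategy="epoch",        # assumed
)
```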

Training results

Training Loss | Epoch | Step  | Validation Loss | Accuracy
------------- | ----- | ----- | --------------- | --------
1.1772        | 1.0   | 375   | 1.1487          | 0.3760
1.0767        | 2.0   | 750   | 1.0817          | 0.4276
0.9966        | 3.0   | 1125  | 1.0309          | 0.4742
0.9479        | 4.0   | 1500  | 0.9881          | 0.5158
0.9443        | 5.0   | 1875  | 0.9501          | 0.5374
0.9172        | 6.0   | 2250  | 0.9138          | 0.5574
0.8419        | 7.0   | 2625  | 0.8792          | 0.5857
0.8745        | 8.0   | 3000  | 0.8452          | 0.6023
0.8037        | 9.0   | 3375  | 0.8135          | 0.6273
0.7725        | 10.0  | 3750  | 0.7849          | 0.6556
0.73          | 11.0  | 4125  | 0.7582          | 0.6705
0.7472        | 12.0  | 4500  | 0.7325          | 0.6822
0.6618        | 13.0  | 4875  | 0.7082          | 0.7105
0.6872        | 14.0  | 5250  | 0.6883          | 0.7255
0.6348        | 15.0  | 5625  | 0.6702          | 0.7304
0.5716        | 16.0  | 6000  | 0.6521          | 0.7388
0.6306        | 17.0  | 6375  | 0.6373          | 0.7454
0.604         | 18.0  | 6750  | 0.6231          | 0.7537
0.5325        | 19.0  | 7125  | 0.6109          | 0.7604
0.602         | 20.0  | 7500  | 0.6001          | 0.7604
0.5143        | 21.0  | 7875  | 0.5907          | 0.7604
0.5394        | 22.0  | 8250  | 0.5814          | 0.7671
0.5319        | 23.0  | 8625  | 0.5730          | 0.7671
0.5182        | 24.0  | 9000  | 0.5659          | 0.7704
0.5353        | 25.0  | 9375  | 0.5592          | 0.7720
0.5323        | 26.0  | 9750  | 0.5527          | 0.7787
0.5286        | 27.0  | 10125 | 0.5472          | 0.7787
0.4649        | 28.0  | 10500 | 0.5420          | 0.7820
0.5396        | 29.0  | 10875 | 0.5373          | 0.7837
0.4863        | 30.0  | 11250 | 0.5330          | 0.7837
0.4735        | 31.0  | 11625 | 0.5292          | 0.7837
0.5595        | 32.0  | 12000 | 0.5255          | 0.7887
0.4938        | 33.0  | 12375 | 0.5219          | 0.7920
0.468         | 34.0  | 12750 | 0.5190          | 0.7970
0.4879        | 35.0  | 13125 | 0.5162          | 0.7970
0.4592        | 36.0  | 13500 | 0.5136          | 0.7970
0.4625        | 37.0  | 13875 | 0.5113          | 0.7953
0.4476        | 38.0  | 14250 | 0.5091          | 0.7970
0.4681        | 39.0  | 14625 | 0.5073          | 0.8003
0.4644        | 40.0  | 15000 | 0.5056          | 0.8003
0.4624        | 41.0  | 15375 | 0.5040          | 0.8003
0.4244        | 42.0  | 15750 | 0.5026          | 0.8003
0.457         | 43.0  | 16125 | 0.5015          | 0.8003
0.4802        | 44.0  | 16500 | 0.5005          | 0.8003
0.4285        | 45.0  | 16875 | 0.4996          | 0.8003
0.4665        | 46.0  | 17250 | 0.4990          | 0.8003
0.4627        | 47.0  | 17625 | 0.4985          | 0.8003
0.4554        | 48.0  | 18000 | 0.4982          | 0.8003
0.4888        | 49.0  | 18375 | 0.4980          | 0.8003
0.4643        | 50.0  | 18750 | 0.4980          | 0.8003

Framework versions

  • Transformers 4.32.1
  • Pytorch 2.1.1+cu121
  • Datasets 2.12.0
  • Tokenizers 0.13.2
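
To reproduce the environment, the installed versions can be compared against the pins listed above; the snippet below is only an illustrative sanity check, not part of the original training code.

```python
# Print the installed versions so they can be compared against the pins
# listed above (4.32.1 / 2.1.1+cu121 / 2.12.0).
import transformers
import torch
import datasets

print("Transformers:", transformers.__version__)
print("PyTorch:", torch.__version__)
print("Datasets:", datasets.__version__)
```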