swinv2-tiny-patch4-window8-256-dmae-humeda-DAV36

This model is a fine-tuned version of microsoft/swinv2-tiny-patch4-window8-256 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 2.6972
  • Accuracy: 0.68
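For quick inference, the checkpoint can be loaded with the standard transformers image-classification classes. This is a minimal sketch, assuming the model is hosted on the Hub as RobertoSonic/swinv2-tiny-patch4-window8-256-dmae-humeda-DAV36 and that example.jpg is a placeholder input image:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "RobertoSonic/swinv2-tiny-patch4-window8-256-dmae-humeda-DAV36"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("example.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```

Since the training dataset and label semantics are undocumented (see below), the id2label names printed here are whatever was stored in the fine-tuned config.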

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 30
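For reference, the configuration above maps onto the transformers Trainer roughly as follows. This is a sketch, not the original training script: output_dir is a placeholder, and the evaluation/save strategies and metric wiring are omitted because they are not documented here.

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters listed above (Transformers 4.47 API);
# output_dir is an assumption, not taken from the original run.
training_args = TrainingArguments(
    output_dir="swinv2-tiny-patch4-window8-256-dmae-humeda-DAV36",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 16 * 2 = 32
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    num_train_epochs=30,
    optim="adamw_torch",            # AdamW, torch implementation
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```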

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 3.214         | 1.0   | 22   | 1.4507          | 0.46     |
| 2.298         | 2.0   | 44   | 1.0632          | 0.62     |
| 0.9579        | 3.0   | 66   | 1.1191          | 0.64     |
| 0.4479        | 4.0   | 88   | 0.9825          | 0.6467   |
| 0.1963        | 5.0   | 110  | 1.2844          | 0.6467   |
| 0.1663        | 6.0   | 132  | 1.2373          | 0.6667   |
| 0.1188        | 7.0   | 154  | 1.4338          | 0.6933   |
| 0.0526        | 8.0   | 176  | 1.6726          | 0.7133   |
| 0.01          | 9.0   | 198  | 2.5248          | 0.6267   |
| 0.028         | 10.0  | 220  | 2.6156          | 0.6467   |
| 0.0296        | 11.0  | 242  | 2.8334          | 0.6533   |
| 0.0074        | 12.0  | 264  | 2.2200          | 0.6867   |
| 0.0022        | 13.0  | 286  | 2.2802          | 0.7467   |
| 0.0225        | 14.0  | 308  | 2.1764          | 0.6933   |
| 0.0058        | 15.0  | 330  | 3.0594          | 0.62     |
| 0.0075        | 16.0  | 352  | 3.2166          | 0.6333   |
| 0.0163        | 17.0  | 374  | 2.4014          | 0.6933   |
| 0.0033        | 18.0  | 396  | 2.9112          | 0.6733   |
| 0.0036        | 19.0  | 418  | 2.8147          | 0.6533   |
| 0.0033        | 20.0  | 440  | 2.7731          | 0.6733   |
| 0.0161        | 21.0  | 462  | 2.0340          | 0.7467   |
| 0.0012        | 22.0  | 484  | 2.4596          | 0.6867   |
| 0.0009        | 23.0  | 506  | 2.7352          | 0.6667   |
| 0.0101        | 24.0  | 528  | 2.8204          | 0.6667   |
| 0.0011        | 25.0  | 550  | 2.8091          | 0.6733   |
| 0.0005        | 26.0  | 572  | 2.8126          | 0.6667   |
| 0.0007        | 27.0  | 594  | 2.7742          | 0.6733   |
| 0.0004        | 28.0  | 616  | 2.7208          | 0.6733   |
| 0.0025        | 29.0  | 638  | 2.7022          | 0.6733   |
| 0.0025        | 30.0  | 660  | 2.6972          | 0.68     |
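The step column implies 22 optimizer steps per epoch (660 total), so the 0.1 warmup ratio covers the first 66 steps. Validation accuracy peaks at 0.7467 (epochs 13 and 21) and ends at 0.68, consistent with the rising validation loss. Below is a minimal, self-contained sketch of the cosine-with-warmup schedule using transformers' get_cosine_schedule_with_warmup; the dummy optimizer exists only to drive the scheduler and is not part of the original training code:

```python
import torch
from transformers import get_cosine_schedule_with_warmup

total_steps = 22 * 30                  # 22 optimizer steps per epoch, 30 epochs
warmup_steps = int(0.1 * total_steps)  # warmup_ratio 0.1 -> 66 steps

# Dummy parameter/optimizer, only to inspect the schedule shape.
param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.AdamW([param], lr=5e-5, betas=(0.9, 0.999), eps=1e-8)
scheduler = get_cosine_schedule_with_warmup(optimizer, warmup_steps, total_steps)

lrs = []
for _ in range(total_steps):
    optimizer.step()
    scheduler.step()
    lrs.append(scheduler.get_last_lr()[0])
print(f"peak lr: {max(lrs):.2e}, final lr: {lrs[-1]:.2e}")
```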

Framework versions

  • Transformers 4.47.1
  • Pytorch 2.5.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.21.0