vit-base-patch32-224-in21k-finetuned-galaxy10-decals
This model is a fine-tuned version of google/vit-base-patch32-224-in21k on the matthieulel/galaxy10_decals dataset. It achieves the following results on the evaluation set:
- Loss: 0.5055
- Accuracy: 0.8422
- Precision: 0.8413
- Recall: 0.8422
- F1: 0.8406
Model description
This is google/vit-base-patch32-224-in21k (ViT-Base with 32×32 patches and 224×224 input, pretrained on ImageNet-21k) fine-tuned for 10-class galaxy morphology classification on the Galaxy10 DECaLS dataset.
Intended uses & limitations
The model is intended for classifying galaxy images into the ten Galaxy10 DECaLS morphology classes. Performance on images from other surveys or preprocessing pipelines is not reported here. A minimal inference example is sketched below.
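A minimal inference sketch, assuming the Transformers image-classification pipeline and the repository id from this card's title; the input file name is a placeholder:

```python
from PIL import Image
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub (repo id taken from this card).
classifier = pipeline(
    "image-classification",
    model="matthieulel/vit-base-patch32-224-in21k-finetuned-galaxy10-decals",
)

image = Image.open("galaxy_cutout.jpg")  # placeholder path to an RGB galaxy image
for prediction in classifier(image, top_k=3):
    print(f"{prediction['label']}: {prediction['score']:.3f}")
```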
Training and evaluation data
The model was fine-tuned and evaluated on the matthieulel/galaxy10_decals dataset (Galaxy10 DECaLS: galaxy images labelled with ten morphology classes); a loading sketch follows.
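A hedged loading sketch with the 🤗 Datasets library; split and column names are not documented in this card and should be inspected after loading:

```python
from datasets import load_dataset

# Load the dataset named in this card; print the DatasetDict to see
# the available splits and columns, which are not documented here.
dataset = load_dataset("matthieulel/galaxy10_decals")
print(dataset)
```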
Training procedure
Training hyperparameters
The following hyperparameters were used during training (a code reconstruction is sketched after the list):
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 512
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
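For reference, a hedged reconstruction of this configuration with the Trainer API (argument names follow Transformers 4.37; dataset, model, and metric setup are omitted, and the evaluation/checkpointing strategy is an assumption):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-base-patch32-224-in21k-finetuned-galaxy10-decals",
    learning_rate=1e-4,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    gradient_accumulation_steps=4,   # 128 * 4 = 512 effective train batch size
    num_train_epochs=30,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    # Assumptions: per-epoch evaluation and best-checkpoint selection,
    # consistent with the per-epoch results table below.
    evaluation_strategy="epoch",
    save_strategy="epoch",
    load_best_model_at_end=True,
    metric_for_best_model="accuracy",
)
```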
Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 |
---|---|---|---|---|---|---|---|
2.0954 | 0.99 | 31 | 1.9514 | 0.3737 | 0.3322 | 0.3737 | 0.2452 |
1.4835 | 1.98 | 62 | 1.3284 | 0.6184 | 0.6016 | 0.6184 | 0.5878 |
1.1252 | 2.98 | 93 | 0.9771 | 0.7300 | 0.7400 | 0.7300 | 0.7004 |
0.9605 | 4.0 | 125 | 0.8374 | 0.7570 | 0.7754 | 0.7570 | 0.7368 |
0.8383 | 4.99 | 156 | 0.7286 | 0.7762 | 0.7728 | 0.7762 | 0.7650 |
0.7665 | 5.98 | 187 | 0.7256 | 0.7689 | 0.7683 | 0.7689 | 0.7586 |
0.7305 | 6.98 | 218 | 0.6640 | 0.7948 | 0.8031 | 0.7948 | 0.7966 |
0.6689 | 8.0 | 250 | 0.6792 | 0.7807 | 0.7859 | 0.7807 | 0.7708 |
0.6783 | 8.99 | 281 | 0.5985 | 0.8117 | 0.8076 | 0.8117 | 0.8071 |
0.6225 | 9.98 | 312 | 0.6118 | 0.8050 | 0.8036 | 0.8050 | 0.8025 |
0.6081 | 10.98 | 343 | 0.5966 | 0.8112 | 0.8108 | 0.8112 | 0.8080 |
0.6028 | 12.0 | 375 | 0.5708 | 0.8202 | 0.8239 | 0.8202 | 0.8199 |
0.6052 | 12.99 | 406 | 0.6035 | 0.8010 | 0.8116 | 0.8010 | 0.7982 |
0.5553 | 13.98 | 437 | 0.5542 | 0.8196 | 0.8199 | 0.8196 | 0.8143 |
0.5526 | 14.98 | 468 | 0.5385 | 0.8326 | 0.8346 | 0.8326 | 0.8317 |
0.5199 | 16.0 | 500 | 0.5298 | 0.8219 | 0.8192 | 0.8219 | 0.8172 |
0.4974 | 16.99 | 531 | 0.5291 | 0.8298 | 0.8306 | 0.8298 | 0.8260 |
0.5015 | 17.98 | 562 | 0.5244 | 0.8275 | 0.8280 | 0.8275 | 0.8267 |
0.4763 | 18.98 | 593 | 0.5190 | 0.8354 | 0.8357 | 0.8354 | 0.8316 |
0.4763 | 20.0 | 625 | 0.5241 | 0.8264 | 0.8286 | 0.8264 | 0.8249 |
0.4592 | 20.99 | 656 | 0.5061 | 0.8410 | 0.8439 | 0.8410 | 0.8406 |
0.4414 | 21.98 | 687 | 0.5207 | 0.8269 | 0.8265 | 0.8269 | 0.8260 |
0.4372 | 22.98 | 718 | 0.5342 | 0.8253 | 0.8283 | 0.8253 | 0.8254 |
0.4118 | 24.0 | 750 | 0.5256 | 0.8275 | 0.8291 | 0.8275 | 0.8274 |
0.4319 | 24.99 | 781 | 0.5055 | 0.8422 | 0.8413 | 0.8422 | 0.8406 |
0.3807 | 25.98 | 812 | 0.5187 | 0.8377 | 0.8375 | 0.8377 | 0.8361 |
0.4066 | 26.98 | 843 | 0.5203 | 0.8348 | 0.8333 | 0.8348 | 0.8326 |
0.376 | 28.0 | 875 | 0.5128 | 0.8365 | 0.8361 | 0.8365 | 0.8348 |
0.3992 | 28.99 | 906 | 0.5108 | 0.8377 | 0.8375 | 0.8377 | 0.8364 |
0.3743 | 29.76 | 930 | 0.5087 | 0.8388 | 0.8389 | 0.8388 | 0.8378 |
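A sketch of a compute_metrics callback that would produce the accuracy/precision/recall/F1 columns above; the weighted averaging mode is an assumption, as the card does not state it:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Return accuracy and (assumed weighted) precision/recall/F1 for the Trainer."""
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, predictions, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, predictions),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```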
Framework versions
- Transformers 4.37.2
- PyTorch 2.3.0
- Datasets 2.19.1
- Tokenizers 0.15.1