---
license: apache-2.0
tags:
  - image-classification
  - vision
  - generated_from_trainer
metrics:
  - f1
model-index:
  - name: vit-base-patch32-224-in21-leicester_binary
    results: []
---

# vit-base-patch32-224-in21-leicester_binary

This model is a fine-tuned version of [google/vit-base-patch32-224-in21k](https://huggingface.co/google/vit-base-patch32-224-in21k) on the davanstrien/leicester_loaded_annotations_binary dataset. It achieves the following results on the evaluation set (a usage sketch follows below):

- Loss: 0.0949
- F1: 0.9747
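
A minimal inference sketch using the 🤗 Transformers pipeline API; the repo id below is inferred from this card's name, and the image path is a placeholder:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint as an image-classification pipeline.
# The repo id is an assumption based on this card's model name.
classifier = pipeline(
    "image-classification",
    model="davanstrien/vit-base-patch32-224-in21-leicester_binary",
)

# Classify a local image; "page.jpg" is a placeholder path.
for pred in classifier("page.jpg"):
    print(f"{pred['label']}: {pred['score']:.4f}")
```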

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):

- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 128
- seed: 1337
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 40.0
- mixed_precision_training: Native AMP
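
As a rough sketch, these settings correspond to a `transformers.TrainingArguments` configuration like the one below; `output_dir` and `evaluation_strategy` are assumptions not recorded in this card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-base-patch32-224-in21-leicester_binary",  # assumed
    learning_rate=2e-05,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=128,
    seed=1337,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=40.0,
    fp16=True,  # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",  # assumed: the table reports per-epoch eval
)
```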

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1     |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| No log        | 1.0   | 7    | 0.4576          | 0.8608 |
| 0.5021        | 2.0   | 14   | 0.3953          | 0.8608 |
| 0.3595        | 3.0   | 21   | 0.3809          | 0.8608 |
| 0.3595        | 4.0   | 28   | 0.3286          | 0.8608 |
| 0.3009        | 5.0   | 35   | 0.2945          | 0.8608 |
| 0.2843        | 6.0   | 42   | 0.3528          | 0.8608 |
| 0.2843        | 7.0   | 49   | 0.2345          | 0.8608 |
| 0.266         | 8.0   | 56   | 0.2499          | 0.8608 |
| 0.222         | 9.0   | 63   | 0.2544          | 0.8608 |
| 0.2018        | 10.0  | 70   | 0.1954          | 0.8608 |
| 0.2018        | 11.0  | 77   | 0.2351          | 0.8608 |
| 0.1948        | 12.0  | 84   | 0.1705          | 0.8608 |
| 0.2053        | 13.0  | 91   | 0.1625          | 0.8734 |
| 0.2053        | 14.0  | 98   | 0.1719          | 0.9367 |
| 0.1729        | 15.0  | 105  | 0.1489          | 0.9367 |
| 0.1535        | 16.0  | 112  | 0.1450          | 0.9494 |
| 0.1535        | 17.0  | 119  | 0.1750          | 0.9494 |
| 0.1492        | 18.0  | 126  | 0.1514          | 0.9494 |
| 0.1349        | 19.0  | 133  | 0.1304          | 0.9620 |
| 0.1538        | 20.0  | 140  | 0.1291          | 0.9620 |
| 0.1538        | 21.0  | 147  | 0.1306          | 0.9620 |
| 0.1357        | 22.0  | 154  | 0.1283          | 0.9620 |
| 0.147         | 23.0  | 161  | 0.1289          | 0.9494 |
| 0.147         | 24.0  | 168  | 0.1339          | 0.9747 |
| 0.1388        | 25.0  | 175  | 0.1244          | 0.9494 |
| 0.1192        | 26.0  | 182  | 0.1117          | 0.9747 |
| 0.1192        | 27.0  | 189  | 0.1105          | 0.9873 |
| 0.112         | 28.0  | 196  | 0.1079          | 0.9747 |
| 0.1215        | 29.0  | 203  | 0.1151          | 0.9620 |
| 0.1139        | 30.0  | 210  | 0.1008          | 0.9873 |
| 0.1139        | 31.0  | 217  | 0.1033          | 0.9747 |
| 0.1164        | 32.0  | 224  | 0.0985          | 0.9873 |
| 0.1192        | 33.0  | 231  | 0.0955          | 0.9873 |
| 0.1192        | 34.0  | 238  | 0.1077          | 0.9620 |
| 0.1132        | 35.0  | 245  | 0.1107          | 0.9620 |
| 0.1021        | 36.0  | 252  | 0.0958          | 0.9873 |
| 0.1021        | 37.0  | 259  | 0.0957          | 0.9873 |
| 0.0945        | 38.0  | 266  | 0.0951          | 0.9747 |
| 0.1244        | 39.0  | 273  | 0.0949          | 0.9747 |
| 0.1012        | 40.0  | 280  | 0.0955          | 0.9873 |

### Framework versions

- Transformers 4.26.0.dev0
- Pytorch 1.12.1+cu113
- Datasets 2.7.1
- Tokenizers 0.13.2