---
license: apache-2.0
base_model: google/vit-base-patch32-224-in21k
tags:
- image-classification
- vision
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: vit-base-patch32-224-in21k-finetuned-galaxy10-decals
  results: []
---


# vit-base-patch32-224-in21k-finetuned-galaxy10-decals

This model is a fine-tuned version of [google/vit-base-patch32-224-in21k](https://huggingface.co/google/vit-base-patch32-224-in21k) on the [matthieulel/galaxy10_decals](https://huggingface.co/datasets/matthieulel/galaxy10_decals) dataset.
It achieves the following results on the evaluation set (metric computation is sketched after the list):
- Loss: 0.5180
- Accuracy: 0.8382
- Precision: 0.8363
- Recall: 0.8382
- F1: 0.8346
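
Recall equals accuracy in these results, which is characteristic of weighted averaging, so the precision/recall/F1 values were most likely computed with `average="weighted"`. Below is a sketch of such a `compute_metrics` function; the signature follows the standard `Trainer` convention, and the averaging mode is an assumption.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # Weighted averaging is assumed; it makes recall coincide with accuracy.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted"
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```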

## Model description

This model is a Vision Transformer (ViT-Base, 32×32 patches, 224×224 input resolution) pre-trained on ImageNet-21k and fine-tuned for galaxy morphology classification. Given a galaxy image, it predicts one of the 10 morphology classes defined by the Galaxy10 DECaLS dataset.

## Intended uses & limitations

The model is intended for classifying galaxy images into the 10 Galaxy10 DECaLS morphology classes. It was fine-tuned only on DECaLS survey imagery at 224×224 resolution, so performance on images from other surveys, at other resolutions, or outside the 10 training classes has not been evaluated.
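
A minimal inference sketch using the `transformers` image-classification pipeline. The repository id below is an assumption based on this card's title; adjust it to wherever the weights are actually hosted.

```python
from transformers import pipeline
from PIL import Image

# Assumed repository id (from this card's title); change if the weights live elsewhere.
classifier = pipeline(
    "image-classification",
    model="matthieulel/vit-base-patch32-224-in21k-finetuned-galaxy10-decals",
)

image = Image.open("galaxy.jpg")  # any RGB galaxy cutout
for pred in classifier(image):
    print(f"{pred['label']}: {pred['score']:.3f}")
```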

## Training and evaluation data

The model was fine-tuned on the [matthieulel/galaxy10_decals](https://huggingface.co/datasets/matthieulel/galaxy10_decals) dataset, a labelled collection of DECaLS galaxy images spanning 10 morphology classes; the metrics above are computed on its evaluation split.
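
The dataset can be loaded with the `datasets` library. The split names below are an assumption; check the dataset card.

```python
from datasets import load_dataset

ds = load_dataset("matthieulel/galaxy10_decals")
print(ds)                    # available splits and their sizes
print(ds["train"].features)  # expect an image column and a 10-way class label
```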

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
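
A hedged reproduction of the list above as `TrainingArguments` for Transformers 4.37. The `output_dir` and evaluation strategy are assumptions; the Adam settings listed above match the `Trainer` default optimizer, so no explicit `optim` is set.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-base-patch32-224-in21k-finetuned-galaxy10-decals",  # assumed
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,  # 32 x 4 = 128 effective train batch size
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=30,
    evaluation_strategy="epoch",    # assumption: the table reports per-epoch eval
)
```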

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| 1.4731        | 0.99  | 124  | 1.3850          | 0.6110   | 0.5791    | 0.6110 | 0.5797 |
| 0.9858        | 2.0   | 249  | 0.8900          | 0.7508   | 0.7578    | 0.7508 | 0.7337 |
| 0.9475        | 3.0   | 374  | 0.7799          | 0.7599   | 0.7667    | 0.7599 | 0.7559 |
| 0.7778        | 4.0   | 499  | 0.6798          | 0.7779   | 0.7825    | 0.7779 | 0.7729 |
| 0.6831        | 4.99  | 623  | 0.6352          | 0.7914   | 0.7916    | 0.7914 | 0.7889 |
| 0.6953        | 6.0   | 748  | 0.5931          | 0.8044   | 0.8076    | 0.8044 | 0.8023 |
| 0.6725        | 7.0   | 873  | 0.7304          | 0.7537   | 0.7671    | 0.7537 | 0.7519 |
| 0.5648        | 8.0   | 998  | 0.6352          | 0.7909   | 0.7961    | 0.7909 | 0.7868 |
| 0.6127        | 8.99  | 1122 | 0.6087          | 0.7858   | 0.7879    | 0.7858 | 0.7820 |
| 0.529         | 10.0  | 1247 | 0.5827          | 0.8072   | 0.8074    | 0.8072 | 0.8041 |
| 0.5212        | 11.0  | 1372 | 0.5787          | 0.8179   | 0.8177    | 0.8179 | 0.8108 |
| 0.4665        | 12.0  | 1497 | 0.5597          | 0.8168   | 0.8213    | 0.8168 | 0.8134 |
| 0.5123        | 12.99 | 1621 | 0.5840          | 0.8044   | 0.8163    | 0.8044 | 0.8044 |
| 0.4918        | 14.0  | 1746 | 0.5592          | 0.8219   | 0.8221    | 0.8219 | 0.8195 |
| 0.4733        | 15.0  | 1871 | 0.5180          | 0.8382   | 0.8363    | 0.8382 | 0.8346 |
| 0.4552        | 16.0  | 1996 | 0.5673          | 0.8174   | 0.8181    | 0.8174 | 0.8153 |
| 0.4004        | 16.99 | 2120 | 0.5711          | 0.8224   | 0.8239    | 0.8224 | 0.8199 |
| 0.3359        | 18.0  | 2245 | 0.5813          | 0.8168   | 0.8153    | 0.8168 | 0.8147 |
| 0.4069        | 19.0  | 2370 | 0.5482          | 0.8343   | 0.8352    | 0.8343 | 0.8307 |
| 0.3783        | 20.0  | 2495 | 0.5658          | 0.8179   | 0.8169    | 0.8179 | 0.8150 |
| 0.3293        | 20.99 | 2619 | 0.5647          | 0.8247   | 0.8234    | 0.8247 | 0.8230 |
| 0.3214        | 22.0  | 2744 | 0.5654          | 0.8309   | 0.8289    | 0.8309 | 0.8293 |
| 0.3285        | 23.0  | 2869 | 0.5943          | 0.8213   | 0.8226    | 0.8213 | 0.8201 |
| 0.2934        | 24.0  | 2994 | 0.5931          | 0.8264   | 0.8287    | 0.8264 | 0.8259 |
| 0.3051        | 24.99 | 3118 | 0.5788          | 0.8309   | 0.8325    | 0.8309 | 0.8303 |
| 0.2911        | 26.0  | 3243 | 0.5700          | 0.8377   | 0.8354    | 0.8377 | 0.8358 |
| 0.2893        | 27.0  | 3368 | 0.5971          | 0.8286   | 0.8320    | 0.8286 | 0.8291 |
| 0.2794        | 28.0  | 3493 | 0.5908          | 0.8315   | 0.8307    | 0.8315 | 0.8303 |
| 0.2506        | 28.99 | 3617 | 0.5914          | 0.8309   | 0.8314    | 0.8309 | 0.8306 |
| 0.2421        | 29.82 | 3720 | 0.5861          | 0.8365   | 0.8366    | 0.8365 | 0.8359 |


### Framework versions

- Transformers 4.37.2
- PyTorch 2.3.0
- Datasets 2.19.1
- Tokenizers 0.15.1
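
To reproduce this environment, pinning the versions above should suffice, e.g. `pip install transformers==4.37.2 torch==2.3.0 datasets==2.19.1 tokenizers==0.15.1` (note that the PyPI package for PyTorch is `torch`).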