---
license: apache-2.0
base_model: facebook/convnextv2-large-1k-224
tags:
- image-classification
- vision
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: convnextv2-large-1k-224-finetuned-galaxy10-decals
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# convnextv2-large-1k-224-finetuned-galaxy10-decals

This model is a fine-tuned version of [facebook/convnextv2-large-1k-224](https://huggingface.co/facebook/convnextv2-large-1k-224) on the matthieulel/galaxy10_decals dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4479
- Accuracy: 0.8681
- Precision: 0.8670
- Recall: 0.8681
- F1: 0.8668
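
For reference, the snippet below is a minimal inference sketch rather than an official usage guide. It assumes the checkpoint is hosted under the repository id `matthieulel/convnextv2-large-1k-224-finetuned-galaxy10-decals` (inferred from the model name and the dataset owner, so treat it as an assumption) and relies on the standard 🤗 Transformers image-classification pipeline; the input file name is a placeholder.

```python
from transformers import pipeline

# Repository id is assumed from the model name; adjust if the checkpoint lives elsewhere.
classifier = pipeline(
    "image-classification",
    model="matthieulel/convnextv2-large-1k-224-finetuned-galaxy10-decals",
)

# Any RGB galaxy cutout works; the pipeline's image processor handles resizing/normalization.
predictions = classifier("galaxy_cutout.png", top_k=3)
print(predictions)
```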

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent `TrainingArguments` follows the list):
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
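
The hyperparameters above map onto `TrainingArguments` roughly as in the sketch below. This is a hedged reconstruction, not the exact training script: it assumes a single-device run (so `per_device_train_batch_size` is 64 and the total batch size of 256 comes from the 4 gradient-accumulation steps), and the `output_dir`, evaluation/save strategies, and best-model handling are assumptions rather than documented settings.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="convnextv2-large-1k-224-finetuned-galaxy10-decals",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=4,   # effective train batch size: 64 * 4 = 256
    num_train_epochs=30,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    evaluation_strategy="epoch",     # assumed: the results table reports metrics once per epoch
    save_strategy="epoch",           # assumed
    load_best_model_at_end=True,     # assumed
)
```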

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| 1.9261        | 0.99  | 62   | 1.8153          | 0.4696   | 0.5070    | 0.4696 | 0.3875 |
| 1.2684        | 2.0   | 125  | 1.1432          | 0.6793   | 0.6395    | 0.6793 | 0.6478 |
| 0.9177        | 2.99  | 187  | 0.7477          | 0.7847   | 0.7832    | 0.7847 | 0.7720 |
| 0.6937        | 4.0   | 250  | 0.5962          | 0.8168   | 0.8145    | 0.8168 | 0.8104 |
| 0.5937        | 4.99  | 312  | 0.5862          | 0.8191   | 0.8234    | 0.8191 | 0.8167 |
| 0.5921        | 6.0   | 375  | 0.5389          | 0.8365   | 0.8454    | 0.8365 | 0.8300 |
| 0.557         | 6.99  | 437  | 0.4944          | 0.8433   | 0.8478    | 0.8433 | 0.8410 |
| 0.5522        | 8.0   | 500  | 0.5022          | 0.8427   | 0.8508    | 0.8427 | 0.8416 |
| 0.5028        | 8.99  | 562  | 0.4481          | 0.8579   | 0.8610    | 0.8579 | 0.8580 |
| 0.4801        | 10.0  | 625  | 0.4360          | 0.8551   | 0.8536    | 0.8551 | 0.8527 |
| 0.4475        | 10.99 | 687  | 0.4663          | 0.8410   | 0.8423    | 0.8410 | 0.8407 |
| 0.411         | 12.0  | 750  | 0.4444          | 0.8546   | 0.8552    | 0.8546 | 0.8538 |
| 0.4173        | 12.99 | 812  | 0.4341          | 0.8613   | 0.8627    | 0.8613 | 0.8595 |
| 0.3995        | 14.0  | 875  | 0.4380          | 0.8653   | 0.8655    | 0.8653 | 0.8637 |
| 0.3657        | 14.99 | 937  | 0.4659          | 0.8625   | 0.8633    | 0.8625 | 0.8615 |
| 0.3533        | 16.0  | 1000 | 0.4600          | 0.8602   | 0.8592    | 0.8602 | 0.8585 |
| 0.3001        | 16.99 | 1062 | 0.5069          | 0.8478   | 0.8455    | 0.8478 | 0.8450 |
| 0.318         | 18.0  | 1125 | 0.4647          | 0.8574   | 0.8576    | 0.8574 | 0.8552 |
| 0.3029        | 18.99 | 1187 | 0.4479          | 0.8681   | 0.8670    | 0.8681 | 0.8668 |
| 0.2915        | 20.0  | 1250 | 0.4772          | 0.8625   | 0.8598    | 0.8625 | 0.8586 |
| 0.2742        | 20.99 | 1312 | 0.4798          | 0.8557   | 0.8538    | 0.8557 | 0.8521 |
| 0.3067        | 22.0  | 1375 | 0.4767          | 0.8602   | 0.8573    | 0.8602 | 0.8575 |
| 0.2758        | 22.99 | 1437 | 0.5099          | 0.8506   | 0.8547    | 0.8506 | 0.8516 |
| 0.2527        | 24.0  | 1500 | 0.5016          | 0.8585   | 0.8563    | 0.8585 | 0.8565 |
| 0.253         | 24.99 | 1562 | 0.4990          | 0.8625   | 0.8605    | 0.8625 | 0.8604 |
| 0.2361        | 26.0  | 1625 | 0.4903          | 0.8602   | 0.8590    | 0.8602 | 0.8591 |
| 0.2325        | 26.99 | 1687 | 0.5062          | 0.8602   | 0.8612    | 0.8602 | 0.8600 |
| 0.2448        | 28.0  | 1750 | 0.4997          | 0.8670   | 0.8648    | 0.8670 | 0.8646 |
| 0.2354        | 28.99 | 1812 | 0.4956          | 0.8608   | 0.8586    | 0.8608 | 0.8590 |
| 0.2156        | 29.76 | 1860 | 0.4970          | 0.8630   | 0.8615    | 0.8630 | 0.8617 |
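
The precision/recall/F1 columns above track accuracy closely (recall matches accuracy at every epoch), which is consistent with weighted averaging in a `compute_metrics` callback. The sketch below shows one way to reproduce metrics of that shape; the actual metric code used for this run is not documented here, so treat it as an assumption.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Assumed metric computation: accuracy plus weighted-average precision/recall/F1."""
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, predictions, average="weighted", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, predictions),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```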


### Framework versions

- Transformers 4.37.2
- Pytorch 2.3.0
- Datasets 2.19.1
- Tokenizers 0.15.1