---
license: other
base_model: nvidia/mit-b1
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: segformer-b1-finetuned-segments-graffiti
  results: []
---

# segformer-b1-finetuned-segments-graffiti

This model is a fine-tuned version of [nvidia/mit-b1](https://huggingface.co/nvidia/mit-b1) on the Adriatogi/graffiti dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2171
- Mean Iou: 0.8381
- Mean Accuracy: 0.9102
- Overall Accuracy: 0.9168
- Accuracy Not Graf: 0.9379
- Accuracy Graf: 0.8826
- Iou Not Graf: 0.8748
- Iou Graf: 0.8015

## Model description

This model is a SegFormer semantic-segmentation model with an MiT-B1 encoder, fine-tuned for binary graffiti segmentation: each pixel is classified as either graffiti or not graffiti.
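
A minimal inference sketch (the checkpoint id below is a hypothetical placeholder, and the image processor is assumed to have been saved alongside the model weights):

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Hypothetical checkpoint id; substitute the actual repository name.
checkpoint = "segformer-b1-finetuned-segments-graffiti"

processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("wall.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Upsample the logits to the original image size and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
segmentation = upsampled.argmax(dim=1)[0]  # assumed label ids: 0 = not graffiti, 1 = graffiti
```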

## Intended uses & limitations

The model is intended for segmenting graffiti regions in photographs. Its limitations have not been systematically documented; behaviour on images that differ substantially from the training distribution is unknown.

## Training and evaluation data

The model was fine-tuned and evaluated on the Adriatogi/graffiti dataset, which pairs images with pixel-level graffiti masks (graffiti vs. not graffiti).
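
A sketch of loading the data with the `datasets` library (the split names and column layout are assumptions; check the dataset card for the actual schema):

```python
from datasets import load_dataset

# Dataset id taken from this card; the schema below is not verified here.
ds = load_dataset("Adriatogi/graffiti")
print(ds)  # inspect available splits and columns before training
```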

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 10
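
For illustration, these settings roughly correspond to a `TrainingArguments` configuration like the sketch below; the output directory is an assumption, and the 20-step evaluation cadence is inferred from the results table that follows.

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters listed above; output_dir and the
# evaluation/logging cadence are assumptions, not recorded settings.
training_args = TrainingArguments(
    output_dir="segformer-b1-finetuned-segments-graffiti",
    learning_rate=1e-4,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=10,
    evaluation_strategy="steps",
    eval_steps=20,
    logging_steps=20,
)
```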

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Not Graf | Accuracy Graf | Iou Not Graf | Iou Graf |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-----------------:|:-------------:|:------------:|:--------:|
| 0.4076        | 0.42  | 20   | 0.5389          | 0.6053   | 0.7982        | 0.7541           | 0.6139            | 0.9825        | 0.6073       | 0.6033   |
| 0.3386        | 0.83  | 40   | 0.2883          | 0.7962   | 0.8984        | 0.8898           | 0.8625            | 0.9343        | 0.8290       | 0.7634   |
| 0.1964        | 1.25  | 60   | 0.2514          | 0.8061   | 0.9009        | 0.8964           | 0.8819            | 0.9200        | 0.8406       | 0.7716   |
| 0.1723        | 1.67  | 80   | 0.2259          | 0.8269   | 0.9058        | 0.9100           | 0.9235            | 0.8880        | 0.8641       | 0.7898   |
| 0.1981        | 2.08  | 100  | 0.2338          | 0.8119   | 0.9040        | 0.8999           | 0.8869            | 0.9210        | 0.8459       | 0.7778   |
| 0.2827        | 2.5   | 120  | 0.2106          | 0.8251   | 0.9080        | 0.9084           | 0.9095            | 0.9066        | 0.8601       | 0.7902   |
| 0.1864        | 2.92  | 140  | 0.2241          | 0.8232   | 0.8956        | 0.9097           | 0.9546            | 0.8365        | 0.8675       | 0.7790   |
| 0.1362        | 3.33  | 160  | 0.2185          | 0.8257   | 0.8978        | 0.9109           | 0.9525            | 0.8431        | 0.8688       | 0.7826   |
| 0.1264        | 3.75  | 180  | 0.2155          | 0.8237   | 0.9054        | 0.9079           | 0.9156            | 0.8952        | 0.8602       | 0.7871   |
| 0.1688        | 4.17  | 200  | 0.2241          | 0.8206   | 0.8985        | 0.9072           | 0.9346            | 0.8625        | 0.8618       | 0.7795   |
| 0.1198        | 4.58  | 220  | 0.2080          | 0.8331   | 0.9087        | 0.9137           | 0.9296            | 0.8877        | 0.8697       | 0.7965   |
| 0.111         | 5.0   | 240  | 0.2033          | 0.8369   | 0.9133        | 0.9154           | 0.9221            | 0.9044        | 0.8710       | 0.8027   |
| 0.2003        | 5.42  | 260  | 0.2214          | 0.8262   | 0.9118        | 0.9084           | 0.8976            | 0.9261        | 0.8586       | 0.7938   |
| 0.1369        | 5.83  | 280  | 0.2044          | 0.8396   | 0.9147        | 0.9170           | 0.9245            | 0.9048        | 0.8734       | 0.8058   |
| 0.1901        | 6.25  | 300  | 0.1968          | 0.8411   | 0.9119        | 0.9185           | 0.9393            | 0.8846        | 0.8771       | 0.8050   |
| 0.1887        | 6.67  | 320  | 0.2098          | 0.8367   | 0.9100        | 0.9159           | 0.9344            | 0.8857        | 0.8731       | 0.8002   |
| 0.0738        | 7.08  | 340  | 0.2205          | 0.8357   | 0.9127        | 0.9147           | 0.9211            | 0.9043        | 0.8699       | 0.8014   |
| 0.1166        | 7.5   | 360  | 0.2274          | 0.8317   | 0.9046        | 0.9135           | 0.9420            | 0.8672        | 0.8709       | 0.7924   |
| 0.1247        | 7.92  | 380  | 0.2225          | 0.8310   | 0.9051        | 0.9130           | 0.9381            | 0.8722        | 0.8698       | 0.7923   |
| 0.1212        | 8.33  | 400  | 0.2230          | 0.8345   | 0.9108        | 0.9143           | 0.9254            | 0.8961        | 0.8699       | 0.7991   |
| 0.0979        | 8.75  | 420  | 0.2226          | 0.8352   | 0.9076        | 0.9153           | 0.9400            | 0.8752        | 0.8730       | 0.7973   |
| 0.0984        | 9.17  | 440  | 0.2189          | 0.8354   | 0.9106        | 0.9149           | 0.9287            | 0.8925        | 0.8712       | 0.7997   |
| 0.1151        | 9.58  | 460  | 0.2185          | 0.8382   | 0.9098        | 0.9170           | 0.9396            | 0.8800        | 0.8751       | 0.8013   |
| 0.0989        | 10.0  | 480  | 0.2171          | 0.8381   | 0.9102        | 0.9168           | 0.9379            | 0.8826        | 0.8748       | 0.8015   |
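
The metric columns follow the naming of the `evaluate` library's `mean_iou` metric. A sketch of how such numbers can be computed from per-image prediction and label maps (the label ids 0 = not graffiti, 1 = graffiti and the ignore index of 255 are assumptions):

```python
import evaluate

metric = evaluate.load("mean_iou")

def compute_metrics(predictions, references):
    """predictions and references are lists of 2-D integer arrays (H, W),
    one per validation image, using the assumed label ids 0/1."""
    results = metric.compute(
        predictions=predictions,
        references=references,
        num_labels=2,
        ignore_index=255,   # assumption: 255 marks unlabeled pixels
        reduce_labels=False,
    )
    return {
        "mean_iou": results["mean_iou"],
        "mean_accuracy": results["mean_accuracy"],
        "overall_accuracy": results["overall_accuracy"],
        "iou_not_graf": results["per_category_iou"][0],
        "iou_graf": results["per_category_iou"][1],
    }
```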


### Framework versions

- Transformers 4.38.2
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2