turalchik committed on
Commit
500eac2
·
verified ·
1 Parent(s): da61453

End of training

Files changed (1): README.md +3 -57
README.md CHANGED
@@ -15,38 +15,6 @@ should probably proofread and complete it, then remove this comment. -->
 
  # detr-resnet-50-dc5-fashionpedia-finetuned
 
  This model is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5) on the None dataset.
- It achieves the following results on the evaluation set:
- - Loss: 2.5675
- - Map: 0.0275
- - Map 50: 0.0634
- - Map 75: 0.0182
- - Map Small: 0.0
- - Map Medium: 0.032
- - Map Large: 0.0752
- - Mar 1: 0.0195
- - Mar 10: 0.0979
- - Mar 100: 0.1196
- - Mar Small: 0.0
- - Mar Medium: 0.1307
- - Mar Large: 0.3012
- - Map Aortic enlargement: 0.1155
- - Mar 100 Aortic enlargement: 0.4321
- - Map Cardiomegaly: 0.132
- - Mar 100 Cardiomegaly: 0.5943
- - Map Ild: 0.0
- - Mar 100 Ild: 0.0
- - Map Lung opacity: 0.0001
- - Mar 100 Lung opacity: 0.05
- - Map Nodule/mass: 0.0
- - Mar 100 Nodule/mass: 0.0
- - Map Other lesion: 0.0
- - Mar 100 Other lesion: 0.0
- - Map Pleural effusion: 0.0
- - Mar 100 Pleural effusion: 0.0
- - Map Pleural thickening: 0.0
- - Mar 100 Pleural thickening: 0.0
- - Map Pulmonary fibrosis: 0.0
- - Mar 100 Pulmonary fibrosis: 0.0
 
  ## Model description
 
@@ -66,38 +34,16 @@ More information needed
 
  The following hyperparameters were used during training:
  - learning_rate: 1e-05
- - train_batch_size: 4
- - eval_batch_size: 4
+ - train_batch_size: 2
+ - eval_batch_size: 2
  - seed: 42
  - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
  - lr_scheduler_type: linear
- - training_steps: 1000
+ - training_steps: 3
  - mixed_precision_training: Native AMP
 
  ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Aortic enlargement | Mar 100 Aortic enlargement | Map Atelectasis | Mar 100 Atelectasis | Map Cardiomegaly | Mar 100 Cardiomegaly | Map Ild | Mar 100 Ild | Map Infiltration | Mar 100 Infiltration | Map Lung opacity | Mar 100 Lung opacity | Map Nodule/mass | Mar 100 Nodule/mass | Map Other lesion | Mar 100 Other lesion | Map Pleural effusion | Mar 100 Pleural effusion | Map Pleural thickening | Mar 100 Pleural thickening | Map Pulmonary fibrosis | Mar 100 Pulmonary fibrosis |
- |:-------------:|:-------:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:----------------------:|:--------------------------:|:---------------:|:-------------------:|:----------------:|:--------------------:|:-------:|:-----------:|:----------------:|:--------------------:|:----------------:|:--------------------:|:---------------:|:-------------------:|:----------------:|:--------------------:|:--------------------:|:------------------------:|:----------------------:|:--------------------------:|:----------------------:|:--------------------------:|
- | No log | 2.7778 | 50 | 3.9076 | 0.0006 | 0.0028 | 0.0002 | 0.0 | 0.0003 | 0.005 | 0.0 | 0.0057 | 0.021 | 0.0 | 0.0143 | 0.0845 | 0.0006 | 0.0415 | -1.0 | -1.0 | 0.0047 | 0.1472 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
- | 4.3075 | 5.5556 | 100 | 3.5001 | 0.0047 | 0.0127 | 0.0028 | 0.0 | 0.0033 | 0.0271 | 0.0 | 0.0317 | 0.0849 | 0.0 | 0.0774 | 0.2357 | 0.0223 | 0.3377 | -1.0 | -1.0 | 0.0199 | 0.4264 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
- | 4.3075 | 8.3333 | 150 | 3.2511 | 0.0079 | 0.0197 | 0.0044 | 0.0 | 0.005 | 0.0459 | 0.0029 | 0.0465 | 0.099 | 0.0 | 0.0962 | 0.3083 | 0.0296 | 0.3057 | 0.0411 | 0.5849 | 0.0 | 0.0 | -1.0 | -1.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
- | 3.7002 | 11.1111 | 200 | 3.0549 | 0.0084 | 0.0247 | 0.0047 | 0.0 | 0.0073 | 0.0297 | 0.0055 | 0.053 | 0.096 | 0.0 | 0.1022 | 0.2488 | 0.0317 | 0.3811 | 0.0439 | 0.483 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
- | 3.7002 | 13.8889 | 250 | 2.8268 | 0.0121 | 0.0276 | 0.0078 | 0.0 | 0.0073 | 0.052 | 0.0101 | 0.0591 | 0.1115 | 0.0 | 0.1193 | 0.3083 | 0.0356 | 0.4 | 0.0733 | 0.6038 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
- | 3.3402 | 16.6667 | 300 | 2.8859 | 0.016 | 0.0383 | 0.0123 | 0.0 | 0.0303 | 0.0413 | 0.0073 | 0.0857 | 0.1061 | 0.0 | 0.1212 | 0.2476 | 0.0591 | 0.4604 | 0.0846 | 0.4943 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
- | 3.3402 | 19.4444 | 350 | 2.8537 | 0.0166 | 0.0363 | 0.0124 | 0.0 | 0.0342 | 0.0553 | 0.0084 | 0.0748 | 0.1 | 0.0 | 0.1113 | 0.2393 | 0.0381 | 0.4264 | 0.1117 | 0.4736 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
- | 3.2237 | 22.2222 | 400 | 2.7335 | 0.0177 | 0.0419 | 0.0131 | 0.0 | 0.0277 | 0.0508 | 0.0096 | 0.0925 | 0.1107 | 0.0 | 0.1143 | 0.2833 | 0.0629 | 0.4509 | 0.0963 | 0.5453 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
- | 3.2237 | 25.0 | 450 | 2.7128 | 0.0209 | 0.0499 | 0.0155 | 0.0 | 0.0236 | 0.0642 | 0.0157 | 0.0922 | 0.1145 | 0.0 | 0.1176 | 0.2964 | 0.0741 | 0.4679 | 0.114 | 0.5623 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
- | 3.0321 | 27.7778 | 500 | 2.6574 | 0.0197 | 0.0433 | 0.0137 | 0.0 | 0.035 | 0.0498 | 0.0143 | 0.0973 | 0.1143 | 0.0 | 0.1192 | 0.2881 | 0.082 | 0.4717 | 0.095 | 0.5566 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
- | 3.0321 | 30.5556 | 550 | 2.6041 | 0.0206 | 0.0504 | 0.0158 | 0.0 | 0.0375 | 0.0616 | 0.0184 | 0.0939 | 0.1202 | 0.0 | 0.1204 | 0.3036 | 0.0693 | 0.4585 | 0.1157 | 0.5736 | 0.0 | 0.0 | 0.0001 | 0.05 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
- | 2.9455 | 33.3333 | 600 | 2.6164 | 0.0252 | 0.0548 | 0.0214 | 0.0 | 0.0413 | 0.0575 | 0.0212 | 0.0979 | 0.1174 | 0.0 | 0.1286 | 0.3083 | 0.1141 | 0.4472 | 0.1131 | 0.6094 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
- | 2.9455 | 36.1111 | 650 | 2.5827 | 0.023 | 0.0516 | 0.0172 | 0.0 | 0.0321 | 0.0486 | 0.0191 | 0.0948 | 0.1212 | 0.0 | 0.1322 | 0.3119 | 0.1166 | 0.4755 | 0.0908 | 0.6151 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
- | 2.8715 | 38.8889 | 700 | 2.5590 | 0.028 | 0.0587 | 0.0232 | 0.0 | 0.0333 | 0.0607 | 0.0247 | 0.0992 | 0.1413 | 0.0 | 0.1453 | 0.3143 | 0.1387 | 0.4717 | 0.1123 | 0.6 | 0.0 | 0.0 | 0.0006 | 0.2 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
- | 2.8715 | 41.6667 | 750 | 2.5453 | 0.0264 | 0.0579 | 0.0192 | 0.0 | 0.0273 | 0.0678 | 0.0243 | 0.0975 | 0.1193 | 0.0 | 0.1275 | 0.3155 | 0.1166 | 0.4566 | 0.121 | 0.617 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
- | 2.7836 | 44.4444 | 800 | 2.5490 | 0.0249 | 0.0559 | 0.0173 | 0.0 | 0.0291 | 0.0634 | 0.0226 | 0.0992 | 0.1166 | 0.0 | 0.1223 | 0.3083 | 0.1078 | 0.4509 | 0.1163 | 0.5981 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
- | 2.7836 | 47.2222 | 850 | 2.5600 | 0.0256 | 0.0593 | 0.0187 | 0.0 | 0.034 | 0.0628 | 0.0189 | 0.0962 | 0.1195 | 0.0 | 0.1292 | 0.3083 | 0.1153 | 0.4698 | 0.1152 | 0.6057 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
- | 2.7591 | 50.0 | 900 | 2.5640 | 0.0258 | 0.0574 | 0.0185 | 0.0 | 0.0336 | 0.0636 | 0.0205 | 0.1039 | 0.1219 | 0.0 | 0.1319 | 0.3036 | 0.1159 | 0.4509 | 0.1162 | 0.5962 | 0.0 | 0.0 | 0.0001 | 0.05 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
- | 2.7591 | 52.7778 | 950 | 2.5807 | 0.0272 | 0.0621 | 0.0175 | 0.0 | 0.0308 | 0.0777 | 0.0203 | 0.0981 | 0.1149 | 0.0 | 0.1252 | 0.3036 | 0.109 | 0.4377 | 0.1361 | 0.5962 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
- | 2.7396 | 55.5556 | 1000 | 2.5675 | 0.0275 | 0.0634 | 0.0182 | 0.0 | 0.032 | 0.0752 | 0.0195 | 0.0979 | 0.1196 | 0.0 | 0.1307 | 0.3012 | 0.1155 | 0.4321 | 0.132 | 0.5943 | 0.0 | 0.0 | 0.0001 | 0.05 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
 
 
  ### Framework versions
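As a side note on the hyperparameters in the diff above: the card lists `lr_scheduler_type: linear` with `learning_rate: 1e-05`. A minimal sketch of how such a schedule behaves, in the style of `transformers`' `get_linear_schedule_with_warmup` (not the card's actual code; zero warmup is an assumption since the card lists none, and `total_steps=1000` is the pre-change `training_steps` value):

```python
def linear_lr(step: int, base_lr: float = 1e-5, total_steps: int = 1000,
              warmup_steps: int = 0) -> float:
    """Return the learning rate at a given optimizer step under a linear schedule."""
    if step < warmup_steps:
        # Linear warmup from 0 up to base_lr (unused here, since warmup_steps=0)
        return base_lr * step / max(1, warmup_steps)
    # Linear decay from base_lr at the end of warmup down to 0 at total_steps
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

So the rate starts at 1e-05 and decays linearly to 0 over the configured number of training steps (e.g. halfway through, at step 500 of 1000, it is 5e-06).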