jordyvl committed
Commit: de8afc5 (1 parent: be08bcc)

update model card README.md

Files changed (1):
  1. README.md +114 -112
README.md CHANGED
@@ -14,14 +14,16 @@ should probably proofread and complete it, then remove this comment. -->
 
 # swin-base_tobacco
 
-This model is a fine-tuned version of [microsoft/swinv2-base-patch4-window12-192-22k](https://huggingface.co/microsoft/swinv2-base-patch4-window12-192-22k) on the None dataset.
+This model is a fine-tuned version of [microsoft/swinv2-base-patch4-window8-256](https://huggingface.co/microsoft/swinv2-base-patch4-window8-256) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: nan
-- Accuracy: 0.05
-- Brier Loss: nan
-- F1 Micro: 0.0500
-- F1 Macro: 0.0095
-- Aurc: 0.9597
+- Loss: 0.6059
+- Accuracy: 0.835
+- Brier Loss: 0.2576
+- Nll: 1.2824
+- F1 Micro: 0.835
+- F1 Macro: 0.8348
+- Ece: 0.1310
+- Aurc: 0.0387
 
 ## Model description
 
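For reference, minimal sketches of the calibration-oriented metrics reported in the summary above (Brier Loss, Ece, Aurc), assuming multi-class softmax probabilities; the commit does not include the evaluation code, so the function names and binning choices here are illustrative, not the author's implementation.

```python
import numpy as np

def brier_loss(probs: np.ndarray, labels: np.ndarray) -> float:
    """Mean squared distance between probability vectors and one-hot labels."""
    onehot = np.eye(probs.shape[1])[labels]
    return float(np.mean(np.sum((probs - onehot) ** 2, axis=1)))

def expected_calibration_error(probs: np.ndarray, labels: np.ndarray, n_bins: int = 10) -> float:
    """ECE: confidence-vs-accuracy gap, weighted over equal-width confidence bins."""
    conf = probs.max(axis=1)
    pred = probs.argmax(axis=1)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (conf > lo) & (conf <= hi)
        if in_bin.any():
            acc = (pred[in_bin] == labels[in_bin]).mean()
            ece += in_bin.mean() * abs(acc - conf[in_bin].mean())
    return float(ece)

def aurc(probs: np.ndarray, labels: np.ndarray) -> float:
    """Area under the risk-coverage curve: mean error rate as coverage grows
    from the most- to the least-confident prediction (lower is better)."""
    conf = probs.max(axis=1)
    errors = (probs.argmax(axis=1) != labels).astype(float)
    order = np.argsort(-conf)
    risk = np.cumsum(errors[order]) / np.arange(1, len(labels) + 1)
    return float(risk.mean())
```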
@@ -41,11 +43,11 @@ More information needed
 
 The following hyperparameters were used during training:
 - learning_rate: 2e-05
-- train_batch_size: 8
-- eval_batch_size: 8
+- train_batch_size: 16
+- eval_batch_size: 16
 - seed: 42
 - gradient_accumulation_steps: 16
-- total_train_batch_size: 128
+- total_train_batch_size: 256
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_ratio: 0.1
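Note the arithmetic behind the batch settings in this hunk: total_train_batch_size is train_batch_size times gradient_accumulation_steps, so the new run is 16 × 16 = 256 (the old run was 8 × 16 = 128). A hedged reconstruction with transformers.TrainingArguments follows; the argument names are the library's, but tying them to this exact run is an assumption, and num_train_epochs is inferred from the 100-epoch results table below.

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="swin-base_tobacco",
    learning_rate=2e-5,
    per_device_train_batch_size=16,  # train_batch_size: 16
    per_device_eval_batch_size=16,   # eval_batch_size: 16
    gradient_accumulation_steps=16,  # 16 * 16 = 256 = total_train_batch_size
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,            # inferred from the results table
    adam_beta1=0.9,                  # optimizer: Adam with betas=(0.9,0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,               # and epsilon=1e-08
)
```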
@@ -53,108 +55,108 @@ The following hyperparameters were used during training:
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
-|:-------------------------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
-| No log | 0.96 | 6 | 2.2525 | 0.195 | 0.8863 | 7.4916 | 0.195 | 0.1222 | 0.2608 | 0.7761 |
-| No log | 1.96 | 12 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 2.96 | 18 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 3.96 | 24 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 4.96 | 30 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 5.96 | 36 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 6.96 | 42 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 7.96 | 48 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 8.96 | 54 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 9.96 | 60 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 10.96 | 66 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 11.96 | 72 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 12.96 | 78 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 13.96 | 84 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 14.96 | 90 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 15.96 | 96 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 16.96 | 102 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 17.96 | 108 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 18.96 | 114 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 19.96 | 120 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 20.96 | 126 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 21.96 | 132 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 22.96 | 138 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 23.96 | 144 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 24.96 | 150 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 25.96 | 156 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 26.96 | 162 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 27.96 | 168 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 28.96 | 174 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 29.96 | 180 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 30.96 | 186 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 31.96 | 192 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 32.96 | 198 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 33.96 | 204 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 34.96 | 210 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 35.96 | 216 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 36.96 | 222 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 37.96 | 228 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 38.96 | 234 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 39.96 | 240 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 40.96 | 246 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 41.96 | 252 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 42.96 | 258 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 43.96 | 264 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 44.96 | 270 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 45.96 | 276 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 46.96 | 282 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 47.96 | 288 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 48.96 | 294 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 49.96 | 300 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 50.96 | 306 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 51.96 | 312 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 52.96 | 318 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 53.96 | 324 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 54.96 | 330 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 55.96 | 336 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 56.96 | 342 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 57.96 | 348 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 58.96 | 354 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 59.96 | 360 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 60.96 | 366 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 61.96 | 372 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 62.96 | 378 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 63.96 | 384 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 64.96 | 390 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 65.96 | 396 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 66.96 | 402 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 67.96 | 408 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 68.96 | 414 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 69.96 | 420 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 70.96 | 426 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 71.96 | 432 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 72.96 | 438 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 73.96 | 444 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 74.96 | 450 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 75.96 | 456 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 76.96 | 462 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 77.96 | 468 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 78.96 | 474 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 79.96 | 480 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 80.96 | 486 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 81.96 | 492 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| No log | 82.96 | 498 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| 56472721816851557388386304.0000 | 83.96 | 504 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| 56472721816851557388386304.0000 | 84.96 | 510 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| 56472721816851557388386304.0000 | 85.96 | 516 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| 56472721816851557388386304.0000 | 86.96 | 522 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| 56472721816851557388386304.0000 | 87.96 | 528 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| 56472721816851557388386304.0000 | 88.96 | 534 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| 56472721816851557388386304.0000 | 89.96 | 540 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| 56472721816851557388386304.0000 | 90.96 | 546 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| 56472721816851557388386304.0000 | 91.96 | 552 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| 56472721816851557388386304.0000 | 92.96 | 558 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| 56472721816851557388386304.0000 | 93.96 | 564 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| 56472721816851557388386304.0000 | 94.96 | 570 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| 56472721816851557388386304.0000 | 95.96 | 576 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| 56472721816851557388386304.0000 | 96.96 | 582 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| 56472721816851557388386304.0000 | 97.96 | 588 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| 56472721816851557388386304.0000 | 98.96 | 594 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
-| 56472721816851557388386304.0000 | 99.96 | 600 | nan | 0.05 | nan | nan | 0.0500 | 0.0095 | nan | 0.9597 |
+| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | Nll | F1 Micro | F1 Macro | Ece | Aurc |
+|:-------------:|:-----:|:----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
+| No log | 0.96 | 3 | 2.3165 | 0.11 | 0.9031 | 7.6310 | 0.11 | 0.0604 | 0.2004 | 0.8718 |
+| No log | 1.96 | 6 | 2.2894 | 0.155 | 0.8975 | 6.8146 | 0.155 | 0.0944 | 0.2230 | 0.8555 |
+| No log | 2.96 | 9 | 2.2481 | 0.215 | 0.8888 | 5.1480 | 0.2150 | 0.1472 | 0.2492 | 0.8119 |
+| No log | 3.96 | 12 | 2.1955 | 0.275 | 0.8770 | 4.2879 | 0.275 | 0.1939 | 0.2844 | 0.6562 |
+| No log | 4.96 | 15 | 2.1326 | 0.36 | 0.8619 | 3.8809 | 0.36 | 0.2199 | 0.3357 | 0.4962 |
+| No log | 5.96 | 18 | 2.0568 | 0.375 | 0.8415 | 3.9254 | 0.375 | 0.2309 | 0.3377 | 0.4471 |
+| No log | 6.96 | 21 | 1.9639 | 0.375 | 0.8126 | 3.8158 | 0.375 | 0.2319 | 0.3195 | 0.4534 |
+| No log | 7.96 | 24 | 1.8621 | 0.375 | 0.7781 | 3.3244 | 0.375 | 0.2456 | 0.2924 | 0.4833 |
+| No log | 8.96 | 27 | 1.7100 | 0.44 | 0.7273 | 2.8211 | 0.44 | 0.3136 | 0.3188 | 0.3515 |
+| No log | 9.96 | 30 | 1.5377 | 0.535 | 0.6611 | 2.4560 | 0.535 | 0.4259 | 0.3557 | 0.2259 |
+| No log | 10.96 | 33 | 1.3588 | 0.595 | 0.5825 | 2.3216 | 0.595 | 0.4933 | 0.2986 | 0.1795 |
+| No log | 11.96 | 36 | 1.2072 | 0.62 | 0.5215 | 2.3831 | 0.62 | 0.5352 | 0.2927 | 0.1541 |
+| No log | 12.96 | 39 | 1.0766 | 0.67 | 0.4715 | 2.2078 | 0.67 | 0.5966 | 0.2727 | 0.1219 |
+| No log | 13.96 | 42 | 0.9699 | 0.675 | 0.4408 | 1.8028 | 0.675 | 0.5961 | 0.2568 | 0.1215 |
+| No log | 14.96 | 45 | 0.8660 | 0.68 | 0.4011 | 1.4772 | 0.68 | 0.5978 | 0.2176 | 0.1014 |
+| No log | 15.96 | 48 | 0.7907 | 0.725 | 0.3709 | 1.4755 | 0.7250 | 0.6768 | 0.2055 | 0.0904 |
+| No log | 16.96 | 51 | 0.7362 | 0.75 | 0.3501 | 1.3822 | 0.75 | 0.7077 | 0.2042 | 0.0806 |
+| No log | 17.96 | 54 | 0.6867 | 0.76 | 0.3322 | 1.3191 | 0.76 | 0.7177 | 0.1926 | 0.0724 |
+| No log | 18.96 | 57 | 0.6572 | 0.78 | 0.3203 | 1.2996 | 0.78 | 0.7424 | 0.1920 | 0.0699 |
+| No log | 19.96 | 60 | 0.6074 | 0.785 | 0.2967 | 1.3136 | 0.785 | 0.7686 | 0.1705 | 0.0589 |
+| No log | 20.96 | 63 | 0.6050 | 0.795 | 0.2956 | 1.3729 | 0.795 | 0.7793 | 0.1762 | 0.0600 |
+| No log | 21.96 | 66 | 0.5748 | 0.83 | 0.2785 | 1.3558 | 0.83 | 0.8113 | 0.1744 | 0.0529 |
+| No log | 22.96 | 69 | 0.5722 | 0.815 | 0.2756 | 1.3937 | 0.815 | 0.8097 | 0.1767 | 0.0489 |
+| No log | 23.96 | 72 | 0.5689 | 0.795 | 0.2750 | 1.3641 | 0.795 | 0.7947 | 0.1452 | 0.0539 |
+| No log | 24.96 | 75 | 0.5536 | 0.825 | 0.2718 | 1.2773 | 0.825 | 0.8068 | 0.1698 | 0.0509 |
+| No log | 25.96 | 78 | 0.5464 | 0.805 | 0.2726 | 1.2772 | 0.805 | 0.7888 | 0.1499 | 0.0487 |
+| No log | 26.96 | 81 | 0.5455 | 0.81 | 0.2626 | 1.3607 | 0.81 | 0.8080 | 0.1750 | 0.0471 |
+| No log | 27.96 | 84 | 0.5542 | 0.815 | 0.2609 | 1.3643 | 0.815 | 0.8089 | 0.1521 | 0.0466 |
+| No log | 28.96 | 87 | 0.5480 | 0.82 | 0.2710 | 1.2996 | 0.82 | 0.8227 | 0.1422 | 0.0468 |
+| No log | 29.96 | 90 | 0.5507 | 0.83 | 0.2654 | 1.3425 | 0.83 | 0.8320 | 0.1491 | 0.0475 |
+| No log | 30.96 | 93 | 0.5608 | 0.815 | 0.2591 | 1.4365 | 0.815 | 0.8145 | 0.1405 | 0.0442 |
+| No log | 31.96 | 96 | 0.5473 | 0.825 | 0.2622 | 1.3600 | 0.825 | 0.8198 | 0.1339 | 0.0424 |
+| No log | 32.96 | 99 | 0.5296 | 0.83 | 0.2588 | 1.2906 | 0.83 | 0.8311 | 0.1373 | 0.0416 |
+| No log | 33.96 | 102 | 0.5370 | 0.82 | 0.2522 | 1.2895 | 0.82 | 0.8214 | 0.1428 | 0.0436 |
+| No log | 34.96 | 105 | 0.5578 | 0.8 | 0.2707 | 1.3364 | 0.8000 | 0.8056 | 0.1708 | 0.0481 |
+| No log | 35.96 | 108 | 0.5193 | 0.825 | 0.2484 | 1.2883 | 0.825 | 0.8250 | 0.1316 | 0.0405 |
+| No log | 36.96 | 111 | 0.5306 | 0.815 | 0.2569 | 1.2856 | 0.815 | 0.8093 | 0.1344 | 0.0420 |
+| No log | 37.96 | 114 | 0.5824 | 0.815 | 0.2729 | 1.3994 | 0.815 | 0.8182 | 0.1418 | 0.0479 |
+| No log | 38.96 | 117 | 0.5486 | 0.82 | 0.2549 | 1.2974 | 0.82 | 0.8259 | 0.1312 | 0.0443 |
+| No log | 39.96 | 120 | 0.5421 | 0.83 | 0.2545 | 1.3575 | 0.83 | 0.8316 | 0.1491 | 0.0415 |
+| No log | 40.96 | 123 | 0.5477 | 0.81 | 0.2700 | 1.3251 | 0.81 | 0.8166 | 0.1499 | 0.0418 |
+| No log | 41.96 | 126 | 0.5404 | 0.825 | 0.2553 | 1.3186 | 0.825 | 0.8309 | 0.1519 | 0.0414 |
+| No log | 42.96 | 129 | 0.5698 | 0.83 | 0.2598 | 1.3249 | 0.83 | 0.8386 | 0.1396 | 0.0452 |
+| No log | 43.96 | 132 | 0.5538 | 0.815 | 0.2605 | 1.3122 | 0.815 | 0.8212 | 0.1410 | 0.0430 |
+| No log | 44.96 | 135 | 0.5369 | 0.81 | 0.2586 | 1.3030 | 0.81 | 0.8141 | 0.1404 | 0.0409 |
+| No log | 45.96 | 138 | 0.5614 | 0.825 | 0.2615 | 1.3881 | 0.825 | 0.8278 | 0.1404 | 0.0427 |
+| No log | 46.96 | 141 | 0.5636 | 0.825 | 0.2601 | 1.4077 | 0.825 | 0.8286 | 0.1345 | 0.0421 |
+| No log | 47.96 | 144 | 0.5783 | 0.83 | 0.2684 | 1.3350 | 0.83 | 0.8304 | 0.1373 | 0.0422 |
+| No log | 48.96 | 147 | 0.5749 | 0.825 | 0.2663 | 1.3167 | 0.825 | 0.8241 | 0.1308 | 0.0424 |
+| No log | 49.96 | 150 | 0.5802 | 0.82 | 0.2692 | 1.3191 | 0.82 | 0.8194 | 0.1217 | 0.0461 |
+| No log | 50.96 | 153 | 0.5696 | 0.82 | 0.2639 | 1.3330 | 0.82 | 0.8175 | 0.1372 | 0.0429 |
+| No log | 51.96 | 156 | 0.5827 | 0.84 | 0.2656 | 1.3975 | 0.8400 | 0.8444 | 0.1378 | 0.0426 |
+| No log | 52.96 | 159 | 0.5725 | 0.805 | 0.2669 | 1.3172 | 0.805 | 0.7997 | 0.1459 | 0.0422 |
+| No log | 53.96 | 162 | 0.5769 | 0.805 | 0.2691 | 1.3111 | 0.805 | 0.7991 | 0.1457 | 0.0434 |
+| No log | 54.96 | 165 | 0.5883 | 0.805 | 0.2647 | 1.4581 | 0.805 | 0.8104 | 0.1405 | 0.0430 |
+| No log | 55.96 | 168 | 0.5834 | 0.835 | 0.2543 | 1.4586 | 0.835 | 0.8349 | 0.1346 | 0.0407 |
+| No log | 56.96 | 171 | 0.5875 | 0.835 | 0.2543 | 1.3211 | 0.835 | 0.8358 | 0.1320 | 0.0402 |
+| No log | 57.96 | 174 | 0.5741 | 0.84 | 0.2533 | 1.3027 | 0.8400 | 0.8405 | 0.1290 | 0.0395 |
+| No log | 58.96 | 177 | 0.5737 | 0.82 | 0.2624 | 1.3104 | 0.82 | 0.8167 | 0.1437 | 0.0396 |
+| No log | 59.96 | 180 | 0.5796 | 0.815 | 0.2603 | 1.4021 | 0.815 | 0.8154 | 0.1286 | 0.0406 |
+| No log | 60.96 | 183 | 0.5711 | 0.83 | 0.2553 | 1.4016 | 0.83 | 0.8306 | 0.1272 | 0.0390 |
+| No log | 61.96 | 186 | 0.5670 | 0.825 | 0.2591 | 1.3136 | 0.825 | 0.8263 | 0.1429 | 0.0406 |
+| No log | 62.96 | 189 | 0.5736 | 0.825 | 0.2592 | 1.3077 | 0.825 | 0.8231 | 0.1244 | 0.0417 |
+| No log | 63.96 | 192 | 0.5730 | 0.83 | 0.2531 | 1.3007 | 0.83 | 0.8274 | 0.1275 | 0.0401 |
+| No log | 64.96 | 195 | 0.6130 | 0.82 | 0.2687 | 1.3014 | 0.82 | 0.8246 | 0.1484 | 0.0414 |
+| No log | 65.96 | 198 | 0.6023 | 0.825 | 0.2596 | 1.3107 | 0.825 | 0.8254 | 0.1373 | 0.0404 |
+| No log | 66.96 | 201 | 0.5923 | 0.825 | 0.2599 | 1.3078 | 0.825 | 0.8263 | 0.1312 | 0.0411 |
+| No log | 67.96 | 204 | 0.6197 | 0.81 | 0.2766 | 1.3046 | 0.81 | 0.8035 | 0.1373 | 0.0451 |
+| No log | 68.96 | 207 | 0.5918 | 0.805 | 0.2651 | 1.3019 | 0.805 | 0.8044 | 0.1407 | 0.0404 |
+| No log | 69.96 | 210 | 0.5908 | 0.835 | 0.2544 | 1.3286 | 0.835 | 0.8344 | 0.1354 | 0.0394 |
+| No log | 70.96 | 213 | 0.5941 | 0.83 | 0.2558 | 1.3019 | 0.83 | 0.8324 | 0.1402 | 0.0401 |
+| No log | 71.96 | 216 | 0.5994 | 0.82 | 0.2588 | 1.2998 | 0.82 | 0.8215 | 0.1297 | 0.0411 |
+| No log | 72.96 | 219 | 0.6083 | 0.825 | 0.2638 | 1.3525 | 0.825 | 0.8257 | 0.1379 | 0.0410 |
+| No log | 73.96 | 222 | 0.5980 | 0.825 | 0.2609 | 1.3515 | 0.825 | 0.8295 | 0.1457 | 0.0394 |
+| No log | 74.96 | 225 | 0.5945 | 0.83 | 0.2568 | 1.3670 | 0.83 | 0.8302 | 0.1324 | 0.0390 |
+| No log | 75.96 | 228 | 0.5982 | 0.845 | 0.2535 | 1.4552 | 0.845 | 0.8476 | 0.1246 | 0.0390 |
+| No log | 76.96 | 231 | 0.5850 | 0.83 | 0.2507 | 1.3700 | 0.83 | 0.8287 | 0.1348 | 0.0391 |
+| No log | 77.96 | 234 | 0.5859 | 0.825 | 0.2566 | 1.2917 | 0.825 | 0.8232 | 0.1309 | 0.0394 |
+| No log | 78.96 | 237 | 0.6085 | 0.835 | 0.2630 | 1.3516 | 0.835 | 0.8370 | 0.1329 | 0.0420 |
+| No log | 79.96 | 240 | 0.6108 | 0.835 | 0.2621 | 1.2943 | 0.835 | 0.8370 | 0.1395 | 0.0414 |
+| No log | 80.96 | 243 | 0.6061 | 0.81 | 0.2596 | 1.2898 | 0.81 | 0.8119 | 0.1313 | 0.0413 |
+| No log | 81.96 | 246 | 0.6006 | 0.815 | 0.2564 | 1.2952 | 0.815 | 0.8122 | 0.1453 | 0.0406 |
+| No log | 82.96 | 249 | 0.6050 | 0.825 | 0.2577 | 1.2998 | 0.825 | 0.8283 | 0.1271 | 0.0400 |
+| No log | 83.96 | 252 | 0.6197 | 0.835 | 0.2658 | 1.3021 | 0.835 | 0.8386 | 0.1222 | 0.0414 |
+| No log | 84.96 | 255 | 0.6086 | 0.825 | 0.2651 | 1.2889 | 0.825 | 0.8251 | 0.1207 | 0.0404 |
+| No log | 85.96 | 258 | 0.5965 | 0.83 | 0.2587 | 1.2929 | 0.83 | 0.8304 | 0.1323 | 0.0397 |
+| No log | 86.96 | 261 | 0.5897 | 0.82 | 0.2550 | 1.2980 | 0.82 | 0.8171 | 0.1372 | 0.0394 |
+| No log | 87.96 | 264 | 0.5887 | 0.83 | 0.2551 | 1.2950 | 0.83 | 0.8290 | 0.1251 | 0.0391 |
+| No log | 88.96 | 267 | 0.5958 | 0.82 | 0.2598 | 1.2871 | 0.82 | 0.8180 | 0.1319 | 0.0392 |
+| No log | 89.96 | 270 | 0.6088 | 0.82 | 0.2658 | 1.2805 | 0.82 | 0.8184 | 0.1513 | 0.0396 |
+| No log | 90.96 | 273 | 0.6192 | 0.825 | 0.2692 | 1.2772 | 0.825 | 0.8263 | 0.1258 | 0.0402 |
+| No log | 91.96 | 276 | 0.6230 | 0.825 | 0.2689 | 1.2777 | 0.825 | 0.8263 | 0.1416 | 0.0404 |
+| No log | 92.96 | 279 | 0.6223 | 0.83 | 0.2667 | 1.2792 | 0.83 | 0.8318 | 0.1296 | 0.0401 |
+| No log | 93.96 | 282 | 0.6145 | 0.83 | 0.2627 | 1.2797 | 0.83 | 0.8321 | 0.1265 | 0.0394 |
+| No log | 94.96 | 285 | 0.6105 | 0.83 | 0.2610 | 1.2807 | 0.83 | 0.8321 | 0.1352 | 0.0392 |
+| No log | 95.96 | 288 | 0.6095 | 0.83 | 0.2602 | 1.2815 | 0.83 | 0.8321 | 0.1360 | 0.0390 |
+| No log | 96.96 | 291 | 0.6076 | 0.835 | 0.2590 | 1.2824 | 0.835 | 0.8348 | 0.1255 | 0.0389 |
+| No log | 97.96 | 294 | 0.6060 | 0.835 | 0.2578 | 1.2827 | 0.835 | 0.8348 | 0.1281 | 0.0388 |
+| No log | 98.96 | 297 | 0.6058 | 0.835 | 0.2575 | 1.2825 | 0.835 | 0.8348 | 0.1410 | 0.0387 |
+| No log | 99.96 | 300 | 0.6059 | 0.835 | 0.2576 | 1.2824 | 0.835 | 0.8348 | 0.1310 | 0.0387 |
 
 
 ### Framework versions
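For completeness, a hedged inference sketch for the fine-tuned checkpoint described by this commit. The Hub repository id below is a placeholder (the commit page does not state the final repo id), and the image path is an example; only the transformers APIs themselves are the library's.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "your-namespace/swin-base_tobacco"  # placeholder id, not confirmed
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("scanned_document.png").convert("RGB")  # example input
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    probs = model(**inputs).logits.softmax(-1)
print(model.config.id2label[int(probs.argmax())], float(probs.max()))
```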