Commit e2f3f03 (verified) · 1 parent: c5f6d10
Author: qubvel-hf (HF staff)

End of training

Files changed (3):
  1. README.md +34 -33
  2. model.safetensors +1 -1
  3. training_args.bin +1 -1
README.md CHANGED
@@ -14,29 +14,29 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [PekingU/rtdetr_r50vd_coco_o365](https://huggingface.co/PekingU/rtdetr_r50vd_coco_o365) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 9.9243
-- Map: 0.4532
-- Map 50: 0.66
-- Map 75: 0.5228
-- Map Small: 0.431
-- Map Medium: 0.3515
-- Map Large: 0.5415
-- Mar 1: 0.3644
-- Mar 10: 0.6286
-- Mar 100: 0.6927
-- Mar Small: 0.5962
-- Mar Medium: 0.5879
-- Mar Large: 0.81
-- Map Coverall: 0.4755
-- Mar 100 Coverall: 0.7974
-- Map Face Shield: 0.4919
-- Mar 100 Face Shield: 0.7176
-- Map Gloves: 0.3847
-- Mar 100 Gloves: 0.6593
-- Map Goggles: 0.3127
-- Mar 100 Goggles: 0.5793
-- Map Mask: 0.6013
-- Mar 100 Mask: 0.7098
+- Loss: 9.7524
+- Map: 0.5298
+- Map 50: 0.7903
+- Map 75: 0.5632
+- Map Small: 0.5092
+- Map Medium: 0.4212
+- Map Large: 0.6655
+- Mar 1: 0.4001
+- Mar 10: 0.6526
+- Mar 100: 0.711
+- Mar Small: 0.6038
+- Mar Medium: 0.5835
+- Mar Large: 0.8378
+- Map Coverall: 0.6271
+- Mar 100 Coverall: 0.8308
+- Map Face Shield: 0.4839
+- Mar 100 Face Shield: 0.7706
+- Map Gloves: 0.5775
+- Mar 100 Gloves: 0.6492
+- Map Goggles: 0.425
+- Mar 100 Goggles: 0.6103
+- Map Mask: 0.5354
+- Mar 100 Mask: 0.6941
 
 ## Model description
 
@@ -61,22 +61,23 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
+- lr_scheduler_warmup_steps: 300
 - num_epochs: 10
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
 |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
-| No log | 1.0 | 107 | 138.4975 | 0.0441 | 0.0808 | 0.0353 | 0.0 | 0.0247 | 0.056 | 0.0547 | 0.1243 | 0.1461 | 0.0 | 0.082 | 0.2388 | 0.2204 | 0.5937 | 0.0001 | 0.0759 | 0.0 | 0.0201 | 0.0 | 0.02 | 0.0 | 0.0209 |
-| No log | 2.0 | 214 | 23.3748 | 0.0916 | 0.1786 | 0.0747 | 0.0461 | 0.0467 | 0.0912 | 0.1138 | 0.269 | 0.3528 | 0.2138 | 0.2623 | 0.4998 | 0.3271 | 0.6284 | 0.0041 | 0.3076 | 0.0078 | 0.2701 | 0.0034 | 0.2246 | 0.1156 | 0.3333 |
-| No log | 3.0 | 321 | 13.3702 | 0.2057 | 0.3793 | 0.196 | 0.1007 | 0.1548 | 0.3415 | 0.2296 | 0.4115 | 0.4959 | 0.2755 | 0.4268 | 0.7117 | 0.4253 | 0.6986 | 0.0393 | 0.5051 | 0.143 | 0.4183 | 0.1092 | 0.3938 | 0.3119 | 0.4636 |
-| No log | 4.0 | 428 | 12.8750 | 0.2236 | 0.4139 | 0.218 | 0.12 | 0.1699 | 0.4095 | 0.225 | 0.4324 | 0.5089 | 0.296 | 0.4525 | 0.7051 | 0.3626 | 0.6342 | 0.0964 | 0.5253 | 0.117 | 0.3996 | 0.2042 | 0.4631 | 0.3377 | 0.5222 |
-| 90.5185 | 5.0 | 535 | 11.9853 | 0.2701 | 0.4731 | 0.2752 | 0.192 | 0.1984 | 0.475 | 0.2573 | 0.4629 | 0.5406 | 0.357 | 0.4739 | 0.7304 | 0.4639 | 0.6973 | 0.1397 | 0.5443 | 0.2001 | 0.5134 | 0.2089 | 0.4354 | 0.3381 | 0.5124 |
-| 90.5185 | 6.0 | 642 | 12.6566 | 0.2422 | 0.4501 | 0.2296 | 0.2014 | 0.1863 | 0.425 | 0.2339 | 0.4469 | 0.5379 | 0.3612 | 0.4893 | 0.7289 | 0.3361 | 0.5752 | 0.1231 | 0.5329 | 0.1813 | 0.5272 | 0.2314 | 0.5108 | 0.3393 | 0.5436 |
-| 90.5185 | 7.0 | 749 | 12.7385 | 0.2411 | 0.432 | 0.2334 | 0.1769 | 0.1784 | 0.442 | 0.2291 | 0.4407 | 0.5321 | 0.3208 | 0.4863 | 0.7248 | 0.3662 | 0.6527 | 0.115 | 0.5114 | 0.1671 | 0.4969 | 0.2244 | 0.4677 | 0.3328 | 0.532 |
-| 90.5185 | 8.0 | 856 | 12.8410 | 0.2614 | 0.4702 | 0.2516 | 0.1796 | 0.1916 | 0.4767 | 0.2389 | 0.451 | 0.5373 | 0.3511 | 0.4776 | 0.7404 | 0.3826 | 0.6739 | 0.1451 | 0.5456 | 0.2148 | 0.5022 | 0.2567 | 0.4646 | 0.3078 | 0.5 |
-| 90.5185 | 9.0 | 963 | 13.1283 | 0.1857 | 0.3361 | 0.1772 | 0.1922 | 0.1448 | 0.3403 | 0.2197 | 0.4346 | 0.5488 | 0.368 | 0.5015 | 0.7352 | 0.2542 | 0.6599 | 0.0948 | 0.5392 | 0.0841 | 0.5022 | 0.211 | 0.5062 | 0.2846 | 0.5364 |
-| 13.6999 | 10.0 | 1070 | 12.8353 | 0.2457 | 0.4365 | 0.2273 | 0.1837 | 0.1881 | 0.4385 | 0.2388 | 0.4518 | 0.5494 | 0.3529 | 0.4936 | 0.7493 | 0.3722 | 0.6748 | 0.1472 | 0.5671 | 0.1703 | 0.496 | 0.2429 | 0.4831 | 0.296 | 0.5262 |
+| No log | 1.0 | 107 | 216.6647 | 0.0037 | 0.0089 | 0.0022 | 0.0032 | 0.0183 | 0.014 | 0.0242 | 0.1046 | 0.1966 | 0.0405 | 0.1831 | 0.4092 | 0.0056 | 0.2649 | 0.001 | 0.1962 | 0.0021 | 0.0719 | 0.0008 | 0.2215 | 0.0091 | 0.2284 |
+| No log | 2.0 | 214 | 96.4364 | 0.0294 | 0.0559 | 0.0257 | 0.0169 | 0.0297 | 0.0299 | 0.0707 | 0.1835 | 0.298 | 0.0948 | 0.2203 | 0.4591 | 0.0888 | 0.5527 | 0.001 | 0.3203 | 0.021 | 0.1259 | 0.0014 | 0.2154 | 0.0346 | 0.2756 |
+| No log | 3.0 | 321 | 28.5504 | 0.1576 | 0.294 | 0.1448 | 0.0752 | 0.0925 | 0.2629 | 0.1621 | 0.3534 | 0.4661 | 0.347 | 0.3964 | 0.6546 | 0.4399 | 0.6518 | 0.0021 | 0.3797 | 0.1282 | 0.3866 | 0.0045 | 0.4 | 0.2132 | 0.5124 |
+| No log | 4.0 | 428 | 17.1997 | 0.2324 | 0.408 | 0.2295 | 0.1228 | 0.1816 | 0.3288 | 0.2317 | 0.4133 | 0.5 | 0.3527 | 0.4438 | 0.6543 | 0.5101 | 0.6396 | 0.0093 | 0.4671 | 0.1827 | 0.4513 | 0.1553 | 0.4062 | 0.3045 | 0.5356 |
+| 117.1144 | 5.0 | 535 | 14.8812 | 0.2495 | 0.4498 | 0.2479 | 0.1261 | 0.1962 | 0.4086 | 0.253 | 0.4388 | 0.5189 | 0.3485 | 0.4683 | 0.7111 | 0.5078 | 0.6752 | 0.0291 | 0.5013 | 0.2265 | 0.4491 | 0.1715 | 0.4246 | 0.3129 | 0.5444 |
+| 117.1144 | 6.0 | 642 | 13.5348 | 0.2572 | 0.4698 | 0.2541 | 0.1377 | 0.1905 | 0.424 | 0.2532 | 0.4315 | 0.4895 | 0.314 | 0.4481 | 0.6649 | 0.5166 | 0.6716 | 0.026 | 0.4873 | 0.2391 | 0.3754 | 0.1866 | 0.3754 | 0.3178 | 0.5378 |
+| 117.1144 | 7.0 | 749 | 12.7545 | 0.2812 | 0.5035 | 0.2612 | 0.1618 | 0.2143 | 0.4653 | 0.2595 | 0.4568 | 0.496 | 0.3394 | 0.4438 | 0.6648 | 0.5152 | 0.6815 | 0.0918 | 0.4949 | 0.2504 | 0.3759 | 0.208 | 0.3954 | 0.3405 | 0.5324 |
+| 117.1144 | 8.0 | 856 | 12.5330 | 0.2909 | 0.5328 | 0.2687 | 0.1568 | 0.2262 | 0.4868 | 0.2831 | 0.4625 | 0.5035 | 0.3209 | 0.4428 | 0.686 | 0.5059 | 0.6838 | 0.1762 | 0.5038 | 0.2528 | 0.3978 | 0.1905 | 0.4062 | 0.3289 | 0.5258 |
+| 117.1144 | 9.0 | 963 | 12.2873 | 0.3023 | 0.5355 | 0.2927 | 0.1621 | 0.2502 | 0.494 | 0.2851 | 0.4696 | 0.5064 | 0.3301 | 0.452 | 0.6736 | 0.5276 | 0.6932 | 0.1696 | 0.4899 | 0.2633 | 0.4085 | 0.2249 | 0.4154 | 0.326 | 0.5249 |
+| 16.4463 | 10.0 | 1070 | 12.2585 | 0.3095 | 0.5506 | 0.3029 | 0.1738 | 0.2405 | 0.4996 | 0.2901 | 0.4721 | 0.5105 | 0.3271 | 0.4558 | 0.6864 | 0.5196 | 0.6892 | 0.2225 | 0.5241 | 0.264 | 0.4022 | 0.2102 | 0.4077 | 0.3309 | 0.5293 |
 
 
 ### Framework versions
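The metrics updated in this commit ("Map 50", "Map 75", etc.) are COCO-style detection metrics: average precision at IoU thresholds of 0.5 and 0.75, with the headline "Map" averaged over thresholds 0.50:0.95. As an illustrative sketch only (not the pycocotools/torchmetrics code the training run actually used), single-class AP at one IoU threshold can be computed like this:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0


def average_precision(detections, ground_truths, iou_threshold=0.5):
    """AP at one IoU threshold for one class.

    detections: list of (score, box); ground_truths: non-empty list of boxes.
    Greedy one-to-one matching in descending score order, then rectangle
    integration of the raw precision-recall curve (COCO proper uses
    101-point interpolation, so real numbers differ slightly).
    """
    detections = sorted(detections, key=lambda d: d[0], reverse=True)
    matched = set()
    tps = []
    for _score, box in detections:
        best_iou, best_gt = 0.0, None
        for i, gt in enumerate(ground_truths):
            if i in matched:
                continue
            overlap = iou(box, gt)
            if overlap > best_iou:
                best_iou, best_gt = overlap, i
        if best_iou >= iou_threshold:
            matched.add(best_gt)  # true positive: consume this ground truth
            tps.append(1)
        else:
            tps.append(0)  # false positive
    ap, tp_count, prev_recall = 0.0, 0, 0.0
    for rank, tp in enumerate(tps, start=1):
        if tp:
            tp_count += 1
            recall = tp_count / len(ground_truths)
            precision = tp_count / rank
            ap += (recall - prev_recall) * precision
            prev_recall = recall
    return ap
```

"Mar 100" in the table is the complementary recall-side metric: average recall with up to 100 detections per image allowed.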
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:67d7208b8a71d2f543e8e3a7487b2f624699517e4908446002ec5c67180b7e11
+oid sha256:c7bed672005b9eb9f6a0d9fc5eb736666d9f54df1f3564dd07ba325b8cbaa4e4
 size 171559340
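The `oid sha256:...` line in a Git LFS pointer is the SHA-256 digest of the actual file contents, so a downloaded `model.safetensors` can be verified against the pointer shown in this diff. A minimal sketch using only the standard library:

```python
import hashlib


def lfs_oid(path, chunk_size=1 << 20):
    """Return 'sha256:<hexdigest>' for a file, hashed in streamed chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return "sha256:" + digest.hexdigest()


# Example check against the new pointer in this commit (requires the
# downloaded weights file to be present locally):
# assert lfs_oid("model.safetensors") == (
#     "sha256:c7bed672005b9eb9f6a0d9fc5eb736666d9f54df1f3564dd07ba325b8cbaa4e4"
# )
```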
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:6aed1e9e0a8f73b486fa4b5a13be28e92b91f11e88f8189ebda9afb8ef4148ad
+oid sha256:7814c879d3e731028ce6719d78c7ee271d06922bd2197a741e415b083c0dabcb
 size 5112
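The README hunk above also shows the visible training hyperparameters, including the `lr_scheduler_warmup_steps: 300` line this commit adds. As a hedged sketch, they map roughly onto `transformers.TrainingArguments` keyword names as follows (learning rate and batch sizes fall outside the shown hunk and are deliberately omitted):

```python
# Presumed TrainingArguments-style settings reconstructed from the diff only;
# fields not visible in the hunk are left out rather than guessed.
training_kwargs = {
    "seed": 42,
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-08,
    "lr_scheduler_type": "linear",
    "warmup_steps": 300,  # the hyperparameter line this commit adds
    "num_train_epochs": 10,
}
```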