# segformer-b0-finetuned-segments-sidewalk-2
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the jhaberbe/amyloid-aggregates dataset. It achieves the following results on the evaluation set:
- Loss: 0.0552
- Mean Iou: 0.4126
- Mean Accuracy: 0.8251
- Overall Accuracy: 0.8251
- Accuracy Background: nan
- Accuracy Amyloid: 0.8251
- Iou Background: 0.0
- Iou Amyloid: 0.8251
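As a quick way to try the checkpoint, here is a minimal inference sketch using the standard `transformers` SegFormer API. The repo id comes from this card's title, the input file name is a placeholder, and the label order (0 = background, 1 = amyloid) is an assumption based on the metric names above.

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

checkpoint = "jhaberbe/segformer-b0-finetuned-segments-sidewalk-2"

processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("tissue_patch.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample before taking the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # assumed: 0 = background, 1 = amyloid
```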
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
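For reference, these values map onto `transformers.TrainingArguments` roughly as follows. This is a sketch, not the exact training script used for this model, and the output directory is illustrative.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-segments-sidewalk-2",  # illustrative
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the Trainer's
    # default AdamW settings, spelled out explicitly here.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```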
### Training results
Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Amyloid | Iou Background | Iou Amyloid |
---|---|---|---|---|---|---|---|---|---|---|
0.5486 | 0.2326 | 20 | 0.6709 | 0.1174 | 0.2349 | 0.2349 | nan | 0.2349 | 0.0 | 0.2349 |
0.3496 | 0.4651 | 40 | 0.6179 | 0.2210 | 0.4420 | 0.4420 | nan | 0.4420 | 0.0 | 0.4420 |
0.2825 | 0.6977 | 60 | 0.4770 | 0.4871 | 0.9743 | 0.9743 | nan | 0.9743 | 0.0 | 0.9743 |
0.2814 | 0.9302 | 80 | 0.2872 | 0.1395 | 0.2790 | 0.2790 | nan | 0.2790 | 0.0 | 0.2790 |
0.2083 | 1.1628 | 100 | 0.2249 | 0.3871 | 0.7742 | 0.7742 | nan | 0.7742 | 0.0 | 0.7742 |
0.1355 | 1.3953 | 120 | 0.2623 | 0.2753 | 0.5507 | 0.5507 | nan | 0.5507 | 0.0 | 0.5507 |
0.1671 | 1.6279 | 140 | 0.1225 | 0.2283 | 0.4565 | 0.4565 | nan | 0.4565 | 0.0 | 0.4565 |
0.1035 | 1.8605 | 160 | 0.0822 | 0.1782 | 0.3564 | 0.3564 | nan | 0.3564 | 0.0 | 0.3564 |
0.097 | 2.0930 | 180 | 0.0783 | 0.1628 | 0.3255 | 0.3255 | nan | 0.3255 | 0.0 | 0.3255 |
0.0641 | 2.3256 | 200 | 0.1464 | 0.4488 | 0.8976 | 0.8976 | nan | 0.8976 | 0.0 | 0.8976 |
0.06 | 2.5581 | 220 | 0.0819 | 0.3430 | 0.6860 | 0.6860 | nan | 0.6860 | 0.0 | 0.6860 |
0.0511 | 2.7907 | 240 | 0.0835 | 0.2525 | 0.5049 | 0.5049 | nan | 0.5049 | 0.0 | 0.5049 |
0.0377 | 3.0233 | 260 | 0.0681 | 0.2144 | 0.4288 | 0.4288 | nan | 0.4288 | 0.0 | 0.4288 |
0.0381 | 3.2558 | 280 | 0.0975 | 0.4299 | 0.8598 | 0.8598 | nan | 0.8598 | 0.0 | 0.8598 |
0.0377 | 3.4884 | 300 | 0.1277 | 0.4671 | 0.9342 | 0.9342 | nan | 0.9342 | 0.0 | 0.9342 |
0.1049 | 3.7209 | 320 | 0.0754 | 0.4208 | 0.8417 | 0.8417 | nan | 0.8417 | 0.0 | 0.8417 |
0.0252 | 3.9535 | 340 | 0.0757 | 0.2233 | 0.4467 | 0.4467 | nan | 0.4467 | 0.0 | 0.4467 |
0.022 | 4.1860 | 360 | 0.0758 | 0.3314 | 0.6628 | 0.6628 | nan | 0.6628 | 0.0 | 0.6628 |
0.0223 | 4.4186 | 380 | 0.0840 | 0.3403 | 0.6806 | 0.6806 | nan | 0.6806 | 0.0 | 0.6806 |
0.0323 | 4.6512 | 400 | 0.0490 | 0.4347 | 0.8694 | 0.8694 | nan | 0.8694 | 0.0 | 0.8694 |
0.0298 | 4.8837 | 420 | 0.0618 | 0.3695 | 0.7391 | 0.7391 | nan | 0.7391 | 0.0 | 0.7391 |
0.0172 | 5.1163 | 440 | 0.0918 | 0.3770 | 0.7541 | 0.7541 | nan | 0.7541 | 0.0 | 0.7541 |
0.0171 | 5.3488 | 460 | 0.0955 | 0.1962 | 0.3924 | 0.3924 | nan | 0.3924 | 0.0 | 0.3924 |
0.0134 | 5.5814 | 480 | 0.1008 | 0.2416 | 0.4832 | 0.4832 | nan | 0.4832 | 0.0 | 0.4832 |
0.0153 | 5.8140 | 500 | 0.0997 | 0.2675 | 0.5351 | 0.5351 | nan | 0.5351 | 0.0 | 0.5351 |
0.0114 | 6.0465 | 520 | 0.0788 | 0.3411 | 0.6823 | 0.6823 | nan | 0.6823 | 0.0 | 0.6823 |
0.0126 | 6.2791 | 540 | 0.0947 | 0.0264 | 0.0527 | 0.0527 | nan | 0.0527 | 0.0 | 0.0527 |
0.0134 | 6.5116 | 560 | 0.0728 | 0.4199 | 0.8399 | 0.8399 | nan | 0.8399 | 0.0 | 0.8399 |
0.0098 | 6.7442 | 580 | 0.0522 | 0.2287 | 0.4574 | 0.4574 | nan | 0.4574 | 0.0 | 0.4574 |
0.0094 | 6.9767 | 600 | 0.0869 | 0.2789 | 0.5578 | 0.5578 | nan | 0.5578 | 0.0 | 0.5578 |
0.0123 | 7.2093 | 620 | 0.1064 | 0.3969 | 0.7937 | 0.7937 | nan | 0.7937 | 0.0 | 0.7937 |
0.017 | 7.4419 | 640 | 0.0700 | 0.2119 | 0.4237 | 0.4237 | nan | 0.4237 | 0.0 | 0.4237 |
0.0092 | 7.6744 | 660 | 0.0471 | 0.1847 | 0.3695 | 0.3695 | nan | 0.3695 | 0.0 | 0.3695 |
0.0083 | 7.9070 | 680 | 0.0701 | 0.3242 | 0.6485 | 0.6485 | nan | 0.6485 | 0.0 | 0.6485 |
0.0102 | 8.1395 | 700 | 0.0793 | 0.3773 | 0.7547 | 0.7547 | nan | 0.7547 | 0.0 | 0.7547 |
0.0114 | 8.3721 | 720 | 0.0596 | 0.3937 | 0.7873 | 0.7873 | nan | 0.7873 | 0.0 | 0.7873 |
0.0069 | 8.6047 | 740 | 0.0646 | 0.3988 | 0.7976 | 0.7976 | nan | 0.7976 | 0.0 | 0.7976 |
0.0084 | 8.8372 | 760 | 0.0686 | 0.3288 | 0.6577 | 0.6577 | nan | 0.6577 | 0.0 | 0.6577 |
0.008 | 9.0698 | 780 | 0.0658 | 0.3455 | 0.6910 | 0.6910 | nan | 0.6910 | 0.0 | 0.6910 |
0.0065 | 9.3023 | 800 | 0.0185 | 0.1691 | 0.3382 | 0.3382 | nan | 0.3382 | 0.0 | 0.3382 |
0.0063 | 9.5349 | 820 | 0.0732 | 0.2238 | 0.4477 | 0.4477 | nan | 0.4477 | 0.0 | 0.4477 |
0.0058 | 9.7674 | 840 | 0.0457 | 0.2353 | 0.4705 | 0.4705 | nan | 0.4705 | 0.0 | 0.4705 |
0.0063 | 10.0 | 860 | 0.0777 | 0.4177 | 0.8354 | 0.8354 | nan | 0.8354 | 0.0 | 0.8354 |
0.0218 | 10.2326 | 880 | 0.0761 | 0.2859 | 0.5718 | 0.5718 | nan | 0.5718 | 0.0 | 0.5718 |
0.0064 | 10.4651 | 900 | 0.0651 | 0.3579 | 0.7159 | 0.7159 | nan | 0.7159 | 0.0 | 0.7159 |
0.0052 | 10.6977 | 920 | 0.0776 | 0.3691 | 0.7383 | 0.7383 | nan | 0.7383 | 0.0 | 0.7383 |
0.0064 | 10.9302 | 940 | 0.0615 | 0.2193 | 0.4386 | 0.4386 | nan | 0.4386 | 0.0 | 0.4386 |
0.01 | 11.1628 | 960 | 0.0513 | 0.2410 | 0.4820 | 0.4820 | nan | 0.4820 | 0.0 | 0.4820 |
0.0106 | 11.3953 | 980 | 0.0739 | 0.3586 | 0.7172 | 0.7172 | nan | 0.7172 | 0.0 | 0.7172 |
0.0173 | 11.6279 | 1000 | 0.0189 | 0.3301 | 0.6601 | 0.6601 | nan | 0.6601 | 0.0 | 0.6601 |
0.0059 | 11.8605 | 1020 | 0.0555 | 0.3845 | 0.7689 | 0.7689 | nan | 0.7689 | 0.0 | 0.7689 |
0.0087 | 12.0930 | 1040 | 0.0496 | 0.3700 | 0.7399 | 0.7399 | nan | 0.7399 | 0.0 | 0.7399 |
0.0046 | 12.3256 | 1060 | 0.0709 | 0.3949 | 0.7899 | 0.7899 | nan | 0.7899 | 0.0 | 0.7899 |
0.0047 | 12.5581 | 1080 | 0.0866 | 0.3646 | 0.7292 | 0.7292 | nan | 0.7292 | 0.0 | 0.7292 |
0.0044 | 12.7907 | 1100 | 0.0595 | 0.3145 | 0.6291 | 0.6291 | nan | 0.6291 | 0.0 | 0.6291 |
0.005 | 13.0233 | 1120 | 0.0922 | 0.3124 | 0.6248 | 0.6248 | nan | 0.6248 | 0.0 | 0.6248 |
0.0043 | 13.2558 | 1140 | 0.0658 | 0.3818 | 0.7636 | 0.7636 | nan | 0.7636 | 0.0 | 0.7636 |
0.004 | 13.4884 | 1160 | 0.0800 | 0.3649 | 0.7297 | 0.7297 | nan | 0.7297 | 0.0 | 0.7297 |
0.0067 | 13.7209 | 1180 | 0.0537 | 0.3581 | 0.7161 | 0.7161 | nan | 0.7161 | 0.0 | 0.7161 |
0.0043 | 13.9535 | 1200 | 0.0735 | 0.2332 | 0.4663 | 0.4663 | nan | 0.4663 | 0.0 | 0.4663 |
0.0047 | 14.1860 | 1220 | 0.0647 | 0.3186 | 0.6373 | 0.6373 | nan | 0.6373 | 0.0 | 0.6373 |
0.0138 | 14.4186 | 1240 | 0.0663 | 0.3183 | 0.6366 | 0.6366 | nan | 0.6366 | 0.0 | 0.6366 |
0.0052 | 14.6512 | 1260 | 0.0504 | 0.3514 | 0.7027 | 0.7027 | nan | 0.7027 | 0.0 | 0.7027 |
0.0037 | 14.8837 | 1280 | 0.0839 | 0.3544 | 0.7087 | 0.7087 | nan | 0.7087 | 0.0 | 0.7087 |
0.0075 | 15.1163 | 1300 | 0.0708 | 0.4158 | 0.8316 | 0.8316 | nan | 0.8316 | 0.0 | 0.8316 |
0.0122 | 15.3488 | 1320 | 0.0835 | 0.3907 | 0.7813 | 0.7813 | nan | 0.7813 | 0.0 | 0.7813 |
0.0034 | 15.5814 | 1340 | 0.0808 | 0.1924 | 0.3848 | 0.3848 | nan | 0.3848 | 0.0 | 0.3848 |
0.0037 | 15.8140 | 1360 | 0.0619 | 0.3453 | 0.6907 | 0.6907 | nan | 0.6907 | 0.0 | 0.6907 |
0.0091 | 16.0465 | 1380 | 0.0918 | 0.4024 | 0.8048 | 0.8048 | nan | 0.8048 | 0.0 | 0.8048 |
0.0034 | 16.2791 | 1400 | 0.0614 | 0.3826 | 0.7652 | 0.7652 | nan | 0.7652 | 0.0 | 0.7652 |
0.0034 | 16.5116 | 1420 | 0.0661 | 0.3468 | 0.6936 | 0.6936 | nan | 0.6936 | 0.0 | 0.6936 |
0.0037 | 16.7442 | 1440 | 0.0442 | 0.2780 | 0.5560 | 0.5560 | nan | 0.5560 | 0.0 | 0.5560 |
0.0058 | 16.9767 | 1460 | 0.0694 | 0.3634 | 0.7267 | 0.7267 | nan | 0.7267 | 0.0 | 0.7267 |
0.0046 | 17.2093 | 1480 | 0.0433 | 0.3618 | 0.7237 | 0.7237 | nan | 0.7237 | 0.0 | 0.7237 |
0.0068 | 17.4419 | 1500 | 0.0718 | 0.3741 | 0.7481 | 0.7481 | nan | 0.7481 | 0.0 | 0.7481 |
0.0053 | 17.6744 | 1520 | 0.0786 | 0.4243 | 0.8485 | 0.8485 | nan | 0.8485 | 0.0 | 0.8485 |
0.0065 | 17.9070 | 1540 | 0.0759 | 0.3885 | 0.7771 | 0.7771 | nan | 0.7771 | 0.0 | 0.7771 |
0.0142 | 18.1395 | 1560 | 0.0707 | 0.3476 | 0.6952 | 0.6952 | nan | 0.6952 | 0.0 | 0.6952 |
0.0036 | 18.3721 | 1580 | 0.0543 | 0.3193 | 0.6386 | 0.6386 | nan | 0.6386 | 0.0 | 0.6386 |
0.003 | 18.6047 | 1600 | 0.0595 | 0.3440 | 0.6879 | 0.6879 | nan | 0.6879 | 0.0 | 0.6879 |
0.006 | 18.8372 | 1620 | 0.0515 | 0.3808 | 0.7616 | 0.7616 | nan | 0.7616 | 0.0 | 0.7616 |
0.0031 | 19.0698 | 1640 | 0.0533 | 0.3440 | 0.6879 | 0.6879 | nan | 0.6879 | 0.0 | 0.6879 |
0.0033 | 19.3023 | 1660 | 0.0758 | 0.3302 | 0.6605 | 0.6605 | nan | 0.6605 | 0.0 | 0.6605 |
0.0176 | 19.5349 | 1680 | 0.0598 | 0.2267 | 0.4534 | 0.4534 | nan | 0.4534 | 0.0 | 0.4534 |
0.0055 | 19.7674 | 1700 | 0.0354 | 0.3215 | 0.6430 | 0.6430 | nan | 0.6430 | 0.0 | 0.6430 |
0.0032 | 20.0 | 1720 | 0.0552 | 0.4126 | 0.8251 | 0.8251 | nan | 0.8251 | 0.0 | 0.8251 |
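The per-step metrics above follow the layout of the `evaluate` library's `mean_iou` metric. The nan background accuracy and 0.0 background IoU throughout are consistent with background pixels being excluded from the ground truth at evaluation time (for example via label reduction), though the card does not state this. Below is a minimal sketch of computing these metrics on toy masks, assuming two labels and an ignore index of 255.

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Toy 2x2 masks standing in for real predictions and annotations.
pred_mask = np.array([[0, 1], [1, 1]])
true_mask = np.array([[0, 1], [0, 1]])

results = metric.compute(
    predictions=[pred_mask],
    references=[true_mask],
    num_labels=2,        # background, amyloid
    ignore_index=255,    # assumed value for unlabeled pixels
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
```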
### Framework versions
- Transformers 4.44.2
- PyTorch 2.5.0+cu121
- Datasets 3.0.2
- Tokenizers 0.19.1