---
license: apache-2.0
base_model: microsoft/resnet-50
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: resnet101_rvl-cdip-_rvl_cdip-NK1000__CEKD_t2.5_a0.5
results: []
---
# resnet101_rvl-cdip-_rvl_cdip-NK1000__CEKD_t2.5_a0.5
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50). The training dataset was not recorded by the Trainer, but the model name points to an RVL-CDIP subset (`rvl_cdip-NK1000`).
It achieves the following results on the evaluation set (a sketch of how the calibration metrics can be computed follows the list):
- Loss: 0.6065
- Accuracy: 0.7915
- Brier Loss: 0.3054
- NLL: 1.9957
- F1 Micro: 0.7915
- F1 Macro: 0.7910
- ECE: 0.0453
- AURC: 0.0607
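
The metric implementations used by the training script are not included in this card. For reference, the sketch below shows one common convention for the two calibration metrics, the multi-class Brier score and the expected calibration error (ECE); the exact conventions used here (binning scheme, Brier normalization) are assumptions.

```python
import numpy as np

def brier_score(probs, labels):
    """Multi-class Brier score: mean squared error between the predicted
    probability vector and the one-hot label vector."""
    onehot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))

def expected_calibration_error(probs, labels, n_bins=10):
    """ECE: weighted average gap between confidence and accuracy
    over equal-width confidence bins."""
    confidences = probs.max(axis=1)
    accuracies = (probs.argmax(axis=1) == labels).astype(float)
    ece = 0.0
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            # Bin weight times |accuracy - confidence| within the bin.
            ece += mask.mean() * abs(accuracies[mask].mean() - confidences[mask].mean())
    return ece
```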
## Model description
More information needed
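
While no description was recorded, the model name (`CEKD_t2.5_a0.5`) suggests the model was trained with a combined cross-entropy and knowledge-distillation objective at temperature T = 2.5 and mixing weight α = 0.5. The sketch below shows the usual Hinton-style formulation of such a loss; the teacher model and the exact mixing convention are not documented, so everything beyond the name is an assumption.

```python
import torch.nn.functional as F

def cekd_loss(student_logits, teacher_logits, labels, t=2.5, alpha=0.5):
    """Hypothetical CE + KD objective matching the name `CEKD_t2.5_a0.5`.
    The mixing convention (alpha on CE vs. on KD) is not documented;
    this is one common formulation."""
    # Hard-label cross-entropy against the ground truth.
    ce = F.cross_entropy(student_logits, labels)
    # Soft-label KL divergence against the (undocumented) teacher,
    # with both distributions softened by the temperature.
    kd = F.kl_div(
        F.log_softmax(student_logits / t, dim=-1),
        F.softmax(teacher_logits / t, dim=-1),
        reduction="batchmean",
    ) * (t * t)  # T^2 rescales gradients to match the unscaled CE term
    return alpha * ce + (1.0 - alpha) * kd
```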
## Intended uses & limitations
More information needed
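
No usage notes were recorded. A minimal inference sketch follows, assuming the checkpoint lives at the repo id implied by this card (`bdpc/resnet101_rvl-cdip-_rvl_cdip-NK1000__CEKD_t2.5_a0.5`) and that the input is a scanned document image; both are assumptions.

```python
from transformers import pipeline

# Repo id and input path are assumptions inferred from this card.
classifier = pipeline(
    "image-classification",
    model="bdpc/resnet101_rvl-cdip-_rvl_cdip-NK1000__CEKD_t2.5_a0.5",
)
print(classifier("scanned_document.png"))  # top predicted document classes
```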
## Training and evaluation data
More information needed
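
No dataset details were recorded. The model name points to RVL-CDIP, a 16-class scanned-document classification benchmark, with an `NK1000` suffix that presumably denotes a 1000-examples-per-class subsample. If the public benchmark was used, it can be loaded from the Hub roughly as below; the repo id and the subsampling interpretation are assumptions.

```python
from datasets import load_dataset

# Assumed Hub id for the public RVL-CDIP benchmark; the NK1000
# per-class subsampling implied by the model name is not reproduced here.
dataset = load_dataset("aharley/rvl_cdip")
print(dataset["train"].features["label"].names)  # 16 document classes
```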
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the reconstruction sketch after this list):
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
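
These settings map onto a `transformers.TrainingArguments` configuration roughly as in the sketch below. This is a hedged reconstruction, not the original training script: the output directory is a placeholder, and any distillation-specific `Trainer` subclass used originally is undocumented.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./resnet_rvl_cdip_cekd",  # placeholder, not the original path
    learning_rate=1e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,             # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,           # lr_scheduler_warmup_ratio
    num_train_epochs=50,
)
```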
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log | 1.0 | 250 | 4.1565 | 0.1378 | 0.9318 | 7.9039 | 0.1378 | 0.1073 | 0.0673 | 0.8326 |
| 4.1485 | 2.0 | 500 | 3.6932 | 0.3235 | 0.8832 | 5.1525 | 0.3235 | 0.2725 | 0.2044 | 0.5507 |
| 4.1485 | 3.0 | 750 | 2.3374 | 0.4725 | 0.6611 | 3.3127 | 0.4725 | 0.4311 | 0.0839 | 0.2921 |
| 2.392 | 4.0 | 1000 | 1.6516 | 0.588 | 0.5470 | 2.8681 | 0.588 | 0.5789 | 0.0620 | 0.1929 |
| 2.392 | 5.0 | 1250 | 1.3260 | 0.6488 | 0.4782 | 2.6378 | 0.6488 | 0.6444 | 0.0486 | 0.1458 |
| 1.1422 | 6.0 | 1500 | 1.0390 | 0.702 | 0.4156 | 2.4086 | 0.702 | 0.7029 | 0.0576 | 0.1097 |
| 1.1422 | 7.0 | 1750 | 0.8420 | 0.7288 | 0.3738 | 2.2222 | 0.7288 | 0.7300 | 0.0553 | 0.0888 |
| 0.708 | 8.0 | 2000 | 0.7753 | 0.7398 | 0.3586 | 2.1518 | 0.7398 | 0.7396 | 0.0587 | 0.0826 |
| 0.708 | 9.0 | 2250 | 0.7797 | 0.7462 | 0.3580 | 2.1095 | 0.7462 | 0.7457 | 0.0581 | 0.0820 |
| 0.5195 | 10.0 | 2500 | 0.7101 | 0.7602 | 0.3404 | 2.0711 | 0.7602 | 0.7612 | 0.0473 | 0.0733 |
| 0.5195 | 11.0 | 2750 | 0.6971 | 0.7645 | 0.3338 | 2.0649 | 0.7645 | 0.7653 | 0.0541 | 0.0715 |
| 0.4176 | 12.0 | 3000 | 0.6936 | 0.7712 | 0.3302 | 2.0265 | 0.7712 | 0.7708 | 0.0515 | 0.0702 |
| 0.4176 | 13.0 | 3250 | 0.6991 | 0.7662 | 0.3346 | 2.0582 | 0.7663 | 0.7657 | 0.0581 | 0.0723 |
| 0.3573 | 14.0 | 3500 | 0.6672 | 0.7722 | 0.3246 | 2.0053 | 0.7722 | 0.7723 | 0.0551 | 0.0683 |
| 0.3573 | 15.0 | 3750 | 0.6735 | 0.777 | 0.3244 | 2.0387 | 0.777 | 0.7782 | 0.0488 | 0.0671 |
| 0.3193 | 16.0 | 4000 | 0.6567 | 0.776 | 0.3216 | 2.0256 | 0.776 | 0.7773 | 0.0499 | 0.0678 |
| 0.3193 | 17.0 | 4250 | 0.6498 | 0.78 | 0.3184 | 1.9865 | 0.78 | 0.7802 | 0.0477 | 0.0662 |
| 0.2893 | 18.0 | 4500 | 0.6763 | 0.7755 | 0.3264 | 2.0844 | 0.7755 | 0.7755 | 0.0531 | 0.0697 |
| 0.2893 | 19.0 | 4750 | 0.6519 | 0.7815 | 0.3183 | 2.0458 | 0.7815 | 0.7817 | 0.0513 | 0.0658 |
| 0.271 | 20.0 | 5000 | 0.6432 | 0.7823 | 0.3147 | 2.0291 | 0.7823 | 0.7827 | 0.0440 | 0.0645 |
| 0.271 | 21.0 | 5250 | 0.6456 | 0.781 | 0.3156 | 2.0493 | 0.7810 | 0.7813 | 0.0487 | 0.0652 |
| 0.2516 | 22.0 | 5500 | 0.6336 | 0.7823 | 0.3144 | 1.9829 | 0.7823 | 0.7822 | 0.0522 | 0.0642 |
| 0.2516 | 23.0 | 5750 | 0.6333 | 0.7837 | 0.3128 | 2.0196 | 0.7837 | 0.7836 | 0.0492 | 0.0641 |
| 0.2397 | 24.0 | 6000 | 0.6337 | 0.7817 | 0.3147 | 2.0180 | 0.7817 | 0.7815 | 0.0494 | 0.0644 |
| 0.2397 | 25.0 | 6250 | 0.6347 | 0.7857 | 0.3145 | 2.0187 | 0.7857 | 0.7856 | 0.0510 | 0.0641 |
| 0.23 | 26.0 | 6500 | 0.6311 | 0.7815 | 0.3129 | 2.0132 | 0.7815 | 0.7819 | 0.0495 | 0.0637 |
| 0.23 | 27.0 | 6750 | 0.6329 | 0.7853 | 0.3125 | 2.0708 | 0.7853 | 0.7852 | 0.0502 | 0.0635 |
| 0.2191 | 28.0 | 7000 | 0.6222 | 0.786 | 0.3109 | 2.0022 | 0.786 | 0.7856 | 0.0483 | 0.0638 |
| 0.2191 | 29.0 | 7250 | 0.6195 | 0.7863 | 0.3096 | 2.0028 | 0.7863 | 0.7859 | 0.0550 | 0.0620 |
| 0.2155 | 30.0 | 7500 | 0.6196 | 0.7883 | 0.3090 | 1.9972 | 0.7883 | 0.7883 | 0.0486 | 0.0624 |
| 0.2155 | 31.0 | 7750 | 0.6167 | 0.787 | 0.3080 | 2.0173 | 0.787 | 0.7871 | 0.0443 | 0.0623 |
| 0.2074 | 32.0 | 8000 | 0.6143 | 0.7897 | 0.3073 | 2.0223 | 0.7897 | 0.7893 | 0.0443 | 0.0614 |
| 0.2074 | 33.0 | 8250 | 0.6123 | 0.787 | 0.3078 | 1.9869 | 0.787 | 0.7866 | 0.0458 | 0.0619 |
| 0.2028 | 34.0 | 8500 | 0.6137 | 0.7873 | 0.3070 | 1.9883 | 0.7873 | 0.7868 | 0.0457 | 0.0623 |
| 0.2028 | 35.0 | 8750 | 0.6152 | 0.786 | 0.3085 | 2.0108 | 0.786 | 0.7863 | 0.0497 | 0.0626 |
| 0.1982 | 36.0 | 9000 | 0.6133 | 0.7863 | 0.3077 | 2.0205 | 0.7863 | 0.7862 | 0.0515 | 0.0615 |
| 0.1982 | 37.0 | 9250 | 0.6145 | 0.7877 | 0.3081 | 1.9930 | 0.7877 | 0.7879 | 0.0444 | 0.0621 |
| 0.1948 | 38.0 | 9500 | 0.6116 | 0.7857 | 0.3078 | 2.0072 | 0.7857 | 0.7854 | 0.0508 | 0.0619 |
| 0.1948 | 39.0 | 9750 | 0.6090 | 0.788 | 0.3059 | 1.9954 | 0.788 | 0.7882 | 0.0430 | 0.0614 |
| 0.1933 | 40.0 | 10000 | 0.6143 | 0.7897 | 0.3072 | 1.9943 | 0.7897 | 0.7899 | 0.0462 | 0.0618 |
| 0.1933 | 41.0 | 10250 | 0.6061 | 0.7887 | 0.3041 | 1.9900 | 0.7887 | 0.7889 | 0.0439 | 0.0606 |
| 0.1882 | 42.0 | 10500 | 0.6070 | 0.7865 | 0.3058 | 1.9907 | 0.7865 | 0.7868 | 0.0438 | 0.0607 |
| 0.1882 | 43.0 | 10750 | 0.6083 | 0.788 | 0.3054 | 2.0095 | 0.788 | 0.7877 | 0.0489 | 0.0608 |
| 0.1871 | 44.0 | 11000 | 0.6083 | 0.787 | 0.3054 | 1.9828 | 0.787 | 0.7872 | 0.0469 | 0.0607 |
| 0.1871 | 45.0 | 11250 | 0.6092 | 0.7893 | 0.3057 | 2.0140 | 0.7893 | 0.7891 | 0.0483 | 0.0608 |
| 0.1862 | 46.0 | 11500 | 0.6057 | 0.7893 | 0.3053 | 2.0064 | 0.7893 | 0.7890 | 0.0450 | 0.0609 |
| 0.1862 | 47.0 | 11750 | 0.6042 | 0.79 | 0.3044 | 1.9691 | 0.79 | 0.7899 | 0.0435 | 0.0607 |
| 0.1845 | 48.0 | 12000 | 0.6068 | 0.79 | 0.3053 | 2.0052 | 0.79 | 0.7899 | 0.0438 | 0.0608 |
| 0.1845 | 49.0 | 12250 | 0.6081 | 0.7893 | 0.3062 | 2.0117 | 0.7893 | 0.7890 | 0.0485 | 0.0612 |
| 0.1836 | 50.0 | 12500 | 0.6065 | 0.7915 | 0.3054 | 1.9957 | 0.7915 | 0.7910 | 0.0453 | 0.0607 |
### Framework versions
- Transformers 4.33.3
- PyTorch 2.2.0.dev20231002
- Datasets 2.7.1
- Tokenizers 0.13.3