
detr-r50-finetuned-mist1-gb-4ah-6l

This model is a fine-tuned version of polejowska/detr-r50-cd45rb-8ah-6l on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 2.2166

Model description

More information needed
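
The checkpoint name and its base model indicate a DETR-style object detector with a ResNet-50 backbone. Below is a minimal inference sketch, assuming the checkpoint follows the standard Transformers DETR object-detection setup; the input image path and the 0.5 confidence threshold are illustrative choices, not part of this card.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

repo = "polejowska/detr-r50-finetuned-mist1-gb-4ah-6l"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForObjectDetection.from_pretrained(repo)

image = Image.open("example.png")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw outputs to (score, label, box) triples above a confidence threshold.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]
for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```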

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 4
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
  • mixed_precision_training: Native AMP
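
For reference, the sketch below shows how these hyperparameters could be expressed as Transformers `TrainingArguments`. The `output_dir` value and the `fp16` flag (standing in for "Native AMP") are assumptions for illustration, not taken from the card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detr-r50-finetuned-mist1-gb-4ah-6l",  # assumed output directory
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    fp16=True,  # mixed precision via native AMP (assumes a CUDA device)
)
```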

Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 3.4044        | 1.0   | 115  | 3.0048          |
| 3.1708        | 2.0   | 230  | 2.9028          |
| 3.0756        | 3.0   | 345  | 2.8538          |
| 2.9769        | 4.0   | 460  | 2.8149          |
| 2.8999        | 5.0   | 575  | 2.7332          |
| 2.8609        | 6.0   | 690  | 2.7212          |
| 2.8338        | 7.0   | 805  | 2.6894          |
| 2.8103        | 8.0   | 920  | 2.7045          |
| 2.8036        | 9.0   | 1035 | 2.7786          |
| 2.7486        | 10.0  | 1150 | 2.6881          |
| 2.7076        | 11.0  | 1265 | 2.6059          |
| 2.7156        | 12.0  | 1380 | 2.6483          |
| 2.6655        | 13.0  | 1495 | 2.5438          |
| 2.6368        | 14.0  | 1610 | 2.5342          |
| 2.5982        | 15.0  | 1725 | 2.5287          |
| 2.6116        | 16.0  | 1840 | 2.4446          |
| 2.5592        | 17.0  | 1955 | 2.4365          |
| 2.5528        | 18.0  | 2070 | 2.4844          |
| 2.5248        | 19.0  | 2185 | 2.4195          |
| 2.4853        | 20.0  | 2300 | 2.4538          |
| 2.5295        | 21.0  | 2415 | 2.5696          |
| 2.5069        | 22.0  | 2530 | 2.4537          |
| 2.4504        | 23.0  | 2645 | 2.5152          |
| 2.4447        | 24.0  | 2760 | 2.4432          |
| 2.4303        | 25.0  | 2875 | 2.4033          |
| 2.4137        | 26.0  | 2990 | 2.3796          |
| 2.41          | 27.0  | 3105 | 2.3599          |
| 2.3816        | 28.0  | 3220 | 2.4018          |
| 2.3752        | 29.0  | 3335 | 2.3116          |
| 2.3929        | 30.0  | 3450 | 2.3105          |
| 2.3791        | 31.0  | 3565 | 2.3677          |
| 2.3639        | 32.0  | 3680 | 2.4312          |
| 2.3475        | 33.0  | 3795 | 2.3052          |
| 2.3429        | 34.0  | 3910 | 2.3222          |
| 2.3115        | 35.0  | 4025 | 2.3126          |
| 2.3276        | 36.0  | 4140 | 2.3154          |
| 2.3126        | 37.0  | 4255 | 2.3534          |
| 2.2934        | 38.0  | 4370 | 2.2566          |
| 2.2901        | 39.0  | 4485 | 2.2748          |
| 2.2622        | 40.0  | 4600 | 2.2620          |
| 2.2707        | 41.0  | 4715 | 2.2336          |
| 2.2338        | 42.0  | 4830 | 2.2242          |
| 2.2457        | 43.0  | 4945 | 2.2192          |
| 2.227         | 44.0  | 5060 | 2.2067          |
| 2.2215        | 45.0  | 5175 | 2.2183          |
| 2.2075        | 46.0  | 5290 | 2.2188          |
| 2.2286        | 47.0  | 5405 | 2.2306          |
| 2.2292        | 48.0  | 5520 | 2.2160          |
| 2.219         | 49.0  | 5635 | 2.2208          |
| 2.2125        | 50.0  | 5750 | 2.2166          |

Framework versions

  • Transformers 4.35.0
  • Pytorch 2.0.0
  • Datasets 2.1.0
  • Tokenizers 0.14.1
