---
library_name: transformers
license: other
base_model: nvidia/mit-b0
tags:
  - image-segmentation
  - vision
  - generated_from_trainer
model-index:
  - name: segformer-finetuned-tt-1000-2k
    results: []
---

# segformer-finetuned-tt-1000-2k

This model is a fine-tuned version of nvidia/mit-b0 on the Saumya-Mundra/text255 dataset. It achieves the following results on the evaluation set:

- Loss: 0.0976
- Mean Iou: 0.4895
- Mean Accuracy: 0.9790
- Overall Accuracy: 0.9790
- Accuracy Text: nan
- Accuracy No Text: 0.9790
- Iou Text: 0.0
- Iou No Text: 0.9790
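For reference, semantic segmentation with the Transformers SegFormer classes looks like the sketch below. To run this checkpoint, pass this repository's id to `SegformerForSemanticSegmentation.from_pretrained` (assumed to be `Saumya-Mundra/segformer-finetuned-tt-1000-2k`); the sketch instead builds a tiny randomly initialized config with hypothetical sizes so it runs offline, and only the two-label head (text / no text) reflects this card.

```python
import torch
from transformers import SegformerConfig, SegformerForSemanticSegmentation

# Tiny randomly initialized stand-in for the real checkpoint
# (hypothetical sizes, chosen only so the example runs offline).
config = SegformerConfig(
    num_encoder_blocks=4,
    depths=[1, 1, 1, 1],
    hidden_sizes=[8, 16, 24, 32],
    num_attention_heads=[1, 1, 1, 1],
    sr_ratios=[8, 4, 2, 1],
    decoder_hidden_size=16,
    num_labels=2,  # this card's two classes: text / no text
)
model = SegformerForSemanticSegmentation(config).eval()

# SegFormer logits come out at 1/4 of the input resolution.
pixel_values = torch.randn(1, 3, 64, 64)
with torch.no_grad():
    logits = model(pixel_values=pixel_values).logits  # (1, 2, 16, 16)

# Upsample to the input size and take the per-pixel argmax
# to get the segmentation map.
seg = torch.nn.functional.interpolate(
    logits, size=(64, 64), mode="bilinear", align_corners=False
).argmax(dim=1)
print(seg.shape)  # torch.Size([1, 64, 64])
```

With the real weights, the same post-processing applies; only the `from_pretrained` call replaces the config construction.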

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 1337
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: polynomial
- training_steps: 2000
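The polynomial schedule over the 2000 training steps can be sketched with Transformers' schedule helper. Note this is a reconstruction, not the training script: `num_warmup_steps=0` and the default `lr_end` are assumptions the card does not state.

```python
import torch
from transformers import get_polynomial_decay_schedule_with_warmup

# A throwaway parameter so we can build an optimizer/scheduler pair
# with the hyperparameters listed above.
param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.AdamW([param], lr=6e-5, betas=(0.9, 0.999), eps=1e-8)

# Polynomial (default power=1.0, i.e. linear) decay over 2000 steps.
# num_warmup_steps=0 is an assumption; the card does not mention warmup.
scheduler = get_polynomial_decay_schedule_with_warmup(
    optimizer, num_warmup_steps=0, num_training_steps=2000
)

lrs = []
for _ in range(2000):
    optimizer.step()
    scheduler.step()
    lrs.append(scheduler.get_last_lr()[0])

print(lrs[0], lrs[-1])  # decays from ~6e-5 toward the helper's lr_end
```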

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Text | Accuracy No Text | Iou Text | Iou No Text |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------:|:----------------:|:--------:|:-----------:|
| 0.3719        | 1.0   | 125  | 0.1986          | 0.4842   | 0.9684        | 0.9684           | nan           | 0.9684           | 0.0      | 0.9684      |
| 0.2348        | 2.0   | 250  | 0.1336          | 0.4932   | 0.9864        | 0.9864           | nan           | 0.9864           | 0.0      | 0.9864      |
| 0.183         | 3.0   | 375  | 0.1268          | 0.4874   | 0.9747        | 0.9747           | nan           | 0.9747           | 0.0      | 0.9747      |
| 0.1485        | 4.0   | 500  | 0.1114          | 0.4901   | 0.9802        | 0.9802           | nan           | 0.9802           | 0.0      | 0.9802      |
| 0.1429        | 5.0   | 625  | 0.1122          | 0.4878   | 0.9757        | 0.9757           | nan           | 0.9757           | 0.0      | 0.9757      |
| 0.1367        | 6.0   | 750  | 0.1075          | 0.4917   | 0.9834        | 0.9834           | nan           | 0.9834           | 0.0      | 0.9834      |
| 0.1333        | 7.0   | 875  | 0.1048          | 0.4897   | 0.9793        | 0.9793           | nan           | 0.9793           | 0.0      | 0.9793      |
| 0.1199        | 8.0   | 1000 | 0.1009          | 0.4888   | 0.9776        | 0.9776           | nan           | 0.9776           | 0.0      | 0.9776      |
| 0.1201        | 9.0   | 1125 | 0.1000          | 0.4903   | 0.9806        | 0.9806           | nan           | 0.9806           | 0.0      | 0.9806      |
| 0.1111        | 10.0  | 1250 | 0.0998          | 0.4904   | 0.9807        | 0.9807           | nan           | 0.9807           | 0.0      | 0.9807      |
| 0.1128        | 11.0  | 1375 | 0.0984          | 0.4896   | 0.9792        | 0.9792           | nan           | 0.9792           | 0.0      | 0.9792      |
| 0.1055        | 12.0  | 1500 | 0.0941          | 0.4918   | 0.9835        | 0.9835           | nan           | 0.9835           | 0.0      | 0.9835      |
| 0.0988        | 13.0  | 1625 | 0.0972          | 0.4907   | 0.9815        | 0.9815           | nan           | 0.9815           | 0.0      | 0.9815      |
| 0.0983        | 14.0  | 1750 | 0.0947          | 0.4921   | 0.9843        | 0.9843           | nan           | 0.9843           | 0.0      | 0.9843      |
| 0.1045        | 15.0  | 1875 | 0.0960          | 0.4897   | 0.9794        | 0.9794           | nan           | 0.9794           | 0.0      | 0.9794      |
| 0.1002        | 16.0  | 2000 | 0.0976          | 0.4895   | 0.9790        | 0.9790           | nan           | 0.9790           | 0.0      | 0.9790      |

### Framework versions

- Transformers 4.49.0.dev0
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0