
DanSumT5-baseV_38821V_41166

This model is a fine-tuned version of emilstabil/DanSumT5-baseV_38821 on an unspecified dataset. It achieves the following results on the evaluation set (a brief usage sketch follows the list):

  • Loss: 2.1413
  • Rouge1: 35.0654
  • Rouge2: 11.6563
  • RougeL: 21.7686
  • RougeLsum: 27.7516
  • Gen Len: 126.3262
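
The checkpoint can be loaded directly with the transformers summarization pipeline. The snippet below is a minimal usage sketch; the input text, the max_length of 128 (chosen to roughly match the average generation length reported above), and the truncation setting are illustrative assumptions rather than settings documented in this card.

```python
from transformers import pipeline

# Minimal usage sketch (assumed settings): load the checkpoint for abstractive summarization.
summarizer = pipeline(
    "summarization",
    model="emilstabil/DanSumT5-baseV_38821V_41166",
)

# Placeholder input: replace with a long Danish document to be summarized.
text = "..."
summary = summarizer(text, max_length=128, truncation=True)[0]["summary_text"]
print(summary)
```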

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list for an equivalent configuration):

  • learning_rate: 3e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 8
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 20
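
As a rough guide, the bullet points above map onto the following Seq2SeqTrainingArguments. This is a sketch only: the output directory and the predict_with_generate flag are assumptions not documented in this card, and the model/dataset loading and Trainer setup are omitted.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of training arguments equivalent to the hyperparameters listed above.
# output_dir and predict_with_generate are assumptions, not taken from this card.
training_args = Seq2SeqTrainingArguments(
    output_dir="DanSumT5-baseV_38821V_41166",  # hypothetical output path
    learning_rate=3e-05,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    gradient_accumulation_steps=4,   # effective train batch size: 2 * 4 = 8
    lr_scheduler_type="linear",
    num_train_epochs=20,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    predict_with_generate=True,      # assumed, so ROUGE can be computed during evaluation
)
```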

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | RougeL  | RougeLsum | Gen Len  |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:--------:|
| No log        | 1.0   | 232  | 2.1957          | 34.7339 | 11.6712 | 21.4644 | 27.6817   | 126.4592 |
| No log        | 2.0   | 465  | 2.1830          | 34.8759 | 12.139  | 21.5278 | 27.3465   | 126.4549 |
| 2.2462        | 3.0   | 697  | 2.1705          | 35.3017 | 12.4909 | 21.9387 | 28.2423   | 126.4807 |
| 2.2462        | 4.0   | 930  | 2.1654          | 34.8508 | 11.4696 | 21.4196 | 27.6267   | 126.279  |
| 2.1581        | 5.0   | 1162 | 2.1613          | 35.223  | 12.1452 | 21.8105 | 28.3086   | 126.6094 |
| 2.1581        | 6.0   | 1395 | 2.1515          | 35.5785 | 12.0532 | 21.9575 | 28.5902   | 126.7082 |
| 2.0992        | 7.0   | 1627 | 2.1560          | 35.1162 | 11.7299 | 21.6834 | 28.0683   | 126.3562 |
| 2.0992        | 8.0   | 1860 | 2.1519          | 35.286  | 11.9648 | 21.8717 | 28.0591   | 126.5193 |
| 2.0477        | 9.0   | 2092 | 2.1471          | 34.9886 | 11.763  | 21.5827 | 27.9164   | 126.5622 |
| 2.0477        | 10.0  | 2325 | 2.1454          | 35.23   | 11.9011 | 21.891  | 28.0888   | 126.2403 |
| 1.9999        | 11.0  | 2557 | 2.1462          | 35.2311 | 12.1353 | 22.1785 | 28.2209   | 126.1803 |
| 1.9999        | 12.0  | 2790 | 2.1411          | 35.0426 | 11.81   | 21.9802 | 28.0833   | 126.515  |
| 1.9791        | 13.0  | 3022 | 2.1417          | 34.8836 | 11.419  | 21.6238 | 27.6304   | 126.6738 |
| 1.9791        | 14.0  | 3255 | 2.1459          | 35.0771 | 11.8678 | 21.9378 | 27.9312   | 126.2918 |
| 1.9791        | 15.0  | 3487 | 2.1409          | 34.9493 | 11.9437 | 21.8772 | 28.0146   | 126.3562 |
| 1.9495        | 16.0  | 3720 | 2.1411          | 35.1092 | 11.8562 | 21.9693 | 28.0417   | 126.1502 |
| 1.9495        | 17.0  | 3952 | 2.1408          | 35.3591 | 12.0079 | 22.0824 | 28.0746   | 126.3176 |
| 1.9391        | 18.0  | 4185 | 2.1414          | 35.1091 | 11.904  | 21.9597 | 27.9814   | 126.1373 |
| 1.9391        | 19.0  | 4417 | 2.1422          | 35.2336 | 12.013  | 22.0223 | 27.8814   | 126.3004 |
| 1.9139        | 19.96 | 4640 | 2.1413          | 35.0654 | 11.6563 | 21.7686 | 27.7516   | 126.3262 |
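
ROUGE columns like the ones above are commonly computed with the evaluate library; the sketch below shows the general pattern. The predictions and references are placeholders, and the assumption that the table reports scores scaled to 0-100 (evaluate returns fractions between 0 and 1) is an inference, not something stated in this card.

```python
import evaluate

# Sketch of ROUGE computation as typically done in summarization training scripts.
rouge = evaluate.load("rouge")
predictions = ["model generated summary"]  # placeholder decoded model outputs
references = ["reference summary"]         # placeholder gold summaries
scores = rouge.compute(predictions=predictions, references=references)
# scores has keys rouge1, rouge2, rougeL, rougeLsum; scale by 100 to compare with the table.
print({k: round(v * 100, 4) for k, v in scores.items()})
```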

Framework versions

  • Transformers 4.32.1
  • Pytorch 2.1.0
  • Datasets 2.12.0
  • Tokenizers 0.13.3