
ptt5-wikilingua-cstnews

This model is a fine-tuned version of arthurmluz/ptt5-wikilingua-30epochs on an unknown dataset. It achieves the following results on the evaluation set (see the usage sketch after this list):

  • Loss: 1.2336
  • Rouge1: 0.2757
  • Rouge2: 0.2182
  • RougeL: 0.2534
  • RougeLsum: 0.2727
  • Gen Len: 19.0
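
The card does not include usage instructions, so here is a minimal loading-and-generation sketch, assuming the checkpoint behaves like a standard T5 seq2seq model served through the Transformers Auto classes; the input text is a placeholder, and the short `max_length` is only inferred from the Gen Len of ~19 reported above.

```python
# Minimal sketch, assuming a standard T5 seq2seq checkpoint (not confirmed by the card).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "arthurmluz/ptt5-wikilingua-cstnews"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

text = "Texto em português para resumir."  # placeholder input
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
# max_length=20 is an assumption based on the reported Gen Len of ~19.
summary_ids = model.generate(**inputs, max_length=20)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```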

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):

  • learning_rate: 2e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 30
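
These settings map directly onto the standard Transformers training arguments; below is a hedged sketch of the equivalent configuration. The output directory is a hypothetical placeholder, and the Adam betas/epsilon listed above are the library defaults, so they need no explicit arguments.

```python
# Sketch of the equivalent Seq2SeqTrainingArguments; output_dir is a placeholder.
from transformers import Seq2SeqTrainingArguments

args = Seq2SeqTrainingArguments(
    output_dir="ptt5-wikilingua-cstnews",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the defaults,
    # matching the optimizer settings listed above.
)
```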

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | RougeL | RougeLsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
| No log        | 1.0   | 47   | 1.4403          | 0.2059 | 0.1334 | 0.1769 | 0.1998    | 18.6452 |
| No log        | 2.0   | 94   | 1.3259          | 0.2356 | 0.1632 | 0.2052 | 0.2286    | 18.871  |
| No log        | 3.0   | 141  | 1.2783          | 0.244  | 0.1737 | 0.215  | 0.2331    | 18.871  |
| No log        | 4.0   | 188  | 1.2469          | 0.2518 | 0.1866 | 0.2174 | 0.2366    | 18.9355 |
| 1.7624        | 5.0   | 235  | 1.2306          | 0.266  | 0.1958 | 0.2321 | 0.2539    | 18.9355 |
| 1.7624        | 6.0   | 282  | 1.2214          | 0.2644 | 0.1991 | 0.2347 | 0.2533    | 18.9355 |
| 1.7624        | 7.0   | 329  | 1.2133          | 0.2603 | 0.1975 | 0.2327 | 0.2505    | 18.9355 |
| 1.7624        | 8.0   | 376  | 1.2076          | 0.267  | 0.2058 | 0.2423 | 0.2589    | 18.9355 |
| 1.3494        | 9.0   | 423  | 1.2026          | 0.2698 | 0.2073 | 0.2454 | 0.2643    | 18.9355 |
| 1.3494        | 10.0  | 470  | 1.1997          | 0.2704 | 0.2078 | 0.2457 | 0.2649    | 19.0    |
| 1.3494        | 11.0  | 517  | 1.2006          | 0.2762 | 0.2151 | 0.2518 | 0.2736    | 19.0    |
| 1.3494        | 12.0  | 564  | 1.2012          | 0.2772 | 0.2163 | 0.2545 | 0.2746    | 19.0    |
| 1.1715        | 13.0  | 611  | 1.2017          | 0.2787 | 0.2176 | 0.2555 | 0.2763    | 19.0    |
| 1.1715        | 14.0  | 658  | 1.2048          | 0.278  | 0.2187 | 0.256  | 0.2753    | 19.0    |
| 1.1715        | 15.0  | 705  | 1.2063          | 0.2755 | 0.2219 | 0.2579 | 0.2735    | 19.0    |
| 1.1715        | 16.0  | 752  | 1.2057          | 0.2768 | 0.2219 | 0.2589 | 0.2748    | 19.0    |
| 1.1715        | 17.0  | 799  | 1.2084          | 0.2798 | 0.2244 | 0.26   | 0.2783    | 19.0    |
| 1.0497        | 18.0  | 846  | 1.2138          | 0.2787 | 0.2258 | 0.2602 | 0.2764    | 19.0    |
| 1.0497        | 19.0  | 893  | 1.2177          | 0.2783 | 0.2248 | 0.26   | 0.2761    | 19.0    |
| 1.0497        | 20.0  | 940  | 1.2166          | 0.2767 | 0.2215 | 0.2583 | 0.2745    | 19.0    |
| 1.0497        | 21.0  | 987  | 1.2187          | 0.2738 | 0.2162 | 0.2527 | 0.2715    | 19.0    |
| 0.9439        | 22.0  | 1034 | 1.2196          | 0.2741 | 0.2171 | 0.2531 | 0.2718    | 19.0    |
| 0.9439        | 23.0  | 1081 | 1.2229          | 0.2741 | 0.2171 | 0.2531 | 0.2718    | 19.0    |
| 0.9439        | 24.0  | 1128 | 1.2257          | 0.2757 | 0.2182 | 0.2534 | 0.2727    | 19.0    |
| 0.9439        | 25.0  | 1175 | 1.2292          | 0.2739 | 0.2171 | 0.2525 | 0.2713    | 19.0    |
| 0.8986        | 26.0  | 1222 | 1.2294          | 0.2739 | 0.2171 | 0.2525 | 0.2713    | 19.0    |
| 0.8986        | 27.0  | 1269 | 1.2310          | 0.2767 | 0.2192 | 0.2548 | 0.2741    | 19.0    |
| 0.8986        | 28.0  | 1316 | 1.2325          | 0.2747 | 0.218  | 0.2536 | 0.2725    | 19.0    |
| 0.8986        | 29.0  | 1363 | 1.2335          | 0.2739 | 0.2171 | 0.2525 | 0.2713    | 19.0    |
| 0.8805        | 30.0  | 1410 | 1.2336          | 0.2757 | 0.2182 | 0.2534 | 0.2727    | 19.0    |
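
The ROUGE columns above follow the naming of the Hugging Face `evaluate` metric; below is a hedged sketch of how such scores are typically computed. The prediction and reference strings are placeholders, not the card's actual outputs.

```python
# Sketch of a typical ROUGE computation with the `evaluate` library;
# the prediction/reference strings are hypothetical placeholders.
import evaluate

rouge = evaluate.load("rouge")
scores = rouge.compute(
    predictions=["resumo gerado pelo modelo"],  # placeholder
    references=["resumo de referência"],        # placeholder
)
print(scores)  # keys: rouge1, rouge2, rougeL, rougeLsum
```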

Framework versions

  • Transformers 4.34.0
  • Pytorch 2.0.1+cu117
  • Datasets 2.14.5
  • Tokenizers 0.14.1