
mt5-small-finetuned-amazon-en-es

This model is a fine-tuned version of google/mt5-small on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 2.5792
  • ROUGE-1: 19.178
  • ROUGE-2: 11.1294
  • ROUGE-L: 18.8056
  • ROUGE-Lsum: 18.8857

Model description

More information needed

Intended uses & limitations

More information needed
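
In the absence of documented usage, the snippet below is a minimal, untested sketch of how this checkpoint could be loaded for abstractive summarization with the transformers summarization pipeline. The repository id matches this card; the example review text and the max_length/min_length values are illustrative assumptions.

```python
# Minimal usage sketch (assumption: the checkpoint works with the standard
# summarization pipeline; the review text and length limits are made up).
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="lakshinav/mt5-small-finetuned-amazon-en-es",
)

review = (
    "I bought this e-reader for my daily commute. The screen is easy on the eyes "
    "and the battery lasts for weeks, but the case feels a bit flimsy."
)

summary = summarizer(review, max_length=30, min_length=5)
print(summary[0]["summary_text"])
```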

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list for how they map onto Seq2SeqTrainingArguments):

  • learning_rate: 5.6e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 8
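
As a rough illustration, the listed values can be expressed with transformers' Seq2SeqTrainingArguments as in the sketch below. The output_dir name, evaluation_strategy, and predict_with_generate settings are assumptions, not values taken from this card.

```python
# Sketch of the hyperparameters above expressed as Seq2SeqTrainingArguments.
# output_dir, evaluation_strategy, and predict_with_generate are assumptions.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="mt5-small-finetuned-amazon-en-es",  # assumed output/repository name
    learning_rate=5.6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=8,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # assumed; the card only reports per-epoch results
    predict_with_generate=True,   # assumed; needed to compute ROUGE during evaluation
)
```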

Training results

| Training Loss | Epoch | Step  | Validation Loss | ROUGE-1 | ROUGE-2 | ROUGE-L | ROUGE-Lsum |
|--------------:|------:|------:|----------------:|--------:|--------:|--------:|-----------:|
| 3.1482        | 1.0   | 1301  | 2.6708          | 17.2978 | 11.0922 | 17.1791 | 17.1414    |
| 2.867         | 2.0   | 2602  | 2.6532          | 17.7932 | 10.2988 | 17.6139 | 17.6418    |
| 2.74          | 3.0   | 3903  | 2.6575          | 19.2584 | 11.6796 | 18.98   | 19.0057    |
| 3.0353        | 4.0   | 5204  | 2.5845          | 19.1599 | 11.2723 | 18.8132 | 18.8559    |
| 2.9691        | 5.0   | 6505  | 2.5820          | 18.2435 | 9.5271  | 17.904  | 17.9735    |
| 2.9221        | 6.0   | 7806  | 2.5784          | 18.5969 | 10.5778 | 18.2837 | 18.2395    |
| 2.8944        | 7.0   | 9107  | 2.5738          | 18.6871 | 10.6402 | 18.4386 | 18.4199    |
| 2.8636        | 8.0   | 10408 | 2.5792          | 19.178  | 11.1294 | 18.8056 | 18.8857    |
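
The ROUGE columns track the validation-set scores after each epoch. As a rough sketch of how such scores can be computed, the snippet below uses the evaluate library's rouge metric; evaluate is not listed under the framework versions below, and the prediction/reference strings are made-up placeholders.

```python
# Sketch of scoring generated summaries with ROUGE via the evaluate library
# (assumption: evaluate and rouge_score are installed; the strings below are made up).
import evaluate

rouge = evaluate.load("rouge")

predictions = ["great e-reader with a long battery life"]
references = ["Great e-reader, the battery lasts for weeks"]

scores = rouge.compute(predictions=predictions, references=references)
# Keys mirror the metric names in this card: rouge1, rouge2, rougeL, rougeLsum.
# Scaling by 100 assumes the card reports percentages.
print({k: round(v * 100, 4) for k, v in scores.items()})
```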

Framework versions

  • Transformers 4.34.0
  • Pytorch 2.0.1+cu118
  • Datasets 2.14.5
  • Tokenizers 0.14.1

Model tree for lakshinav/mt5-small-finetuned-amazon-en-es

  • Base model: google/mt5-small
  • Fine-tuned: this model