
mt5_small_wmt16_de_en

This model is a fine-tuned version of google/mt5-small on the wmt16 dataset. It achieves the following results on the evaluation set:

  • Loss: 2.4612
  • Rouge1: 0.3666
  • Rouge2: 0.147
  • Rougel: 0.3362
  • Sacrebleu: 6.4622
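
The checkpoint can be loaded with the standard transformers seq2seq API. A minimal inference sketch follows; note that the card does not state whether a task prefix (such as T5-style "translate German to English: ") was used during fine-tuning, so the raw-input form below is an assumption.

```python
# Minimal inference sketch; whether a task prefix was used in fine-tuning
# is not stated in this card, so the plain input below is an assumption.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "chunwoolee0/mt5_small_wmt16_de_en"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

text = "Der Himmel ist blau."  # German source sentence
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```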

Model description

Multilingual T5 (mT5) is a massively multilingual pretrained text-to-text transformer model, trained following a similar recipe as T5.

Intended uses & limitations

This model was trained as an exercise in getting familiar with mT5, with the eventual goal of applying it to English-to-Korean translation.

Training and evaluation data

This work was done as an exercise toward English-Korean translation, so training used only a very small portion of the very large original dataset. The output quality is therefore not expected to be high. A sketch of loading such a subset is shown below.
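
The exact subset size is not stated in the card, so the split slice in this sketch is purely illustrative.

```python
# Illustrative only: the actual subset size used for training is not
# stated, so the slice below is an assumption.
from datasets import load_dataset

raw = load_dataset("wmt16", "de-en", split="train[:20000]")  # small slice
print(raw[0]["translation"])  # {'de': '...', 'en': '...'}
```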

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0005
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 8
  • total_train_batch_size: 64
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 5
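
These values map directly onto transformers `Seq2SeqTrainingArguments`, as in the sketch below; `output_dir` and `predict_with_generate` are placeholders not taken from the card.

```python
# Sketch mapping the listed hyperparameters onto Seq2SeqTrainingArguments.
# output_dir and predict_with_generate are assumptions, not from the card.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="mt5_small_wmt16_de_en",
    learning_rate=5e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=8,  # effective train batch size 8 * 8 = 64
    lr_scheduler_type="linear",
    num_train_epochs=5,
    predict_with_generate=True,
)
```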

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Sacrebleu |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|
| 3.3059        | 1.6   | 500  | 2.5597          | 0.3398 | 0.1261 | 0.3068 | 5.5524    |
| 2.4093        | 3.2   | 1000 | 2.4996          | 0.3609 | 0.144  | 0.3304 | 6.2002    |
| 2.2322        | 4.8   | 1500 | 2.4612          | 0.3666 | 0.147  | 0.3362 | 6.4622    |
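
The card does not include the evaluation script; the sketch below shows one plausible way to compute the reported metrics with the `evaluate` library, using placeholder predictions and references.

```python
# Hedged sketch of computing the reported metrics with the `evaluate`
# library; the actual evaluation script is not included in the card.
import evaluate

rouge = evaluate.load("rouge")
sacrebleu = evaluate.load("sacrebleu")

predictions = ["The sky is blue."]   # model outputs (placeholder)
references = [["The sky is blue."]]  # gold translations (placeholder)

print(rouge.compute(predictions=predictions,
                    references=[r[0] for r in references]))
print(sacrebleu.compute(predictions=predictions, references=references))
```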

Framework versions

  • Transformers 4.32.0
  • Pytorch 2.0.1+cu118
  • Datasets 2.14.4
  • Tokenizers 0.13.3
