results_t5_wiki

This model is a fine-tuned version of ahmeddbahaa/t5-arabic-base-finetuned-wikilingua-ar on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0002
  • Rouge1: 0.1188
  • Rouge2: 0.0194
  • RougeL: 0.1188
  • RougeLsum: 0.1186
  • Gen Len: 19.0
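
Since usage is otherwise undocumented, the snippet below is a minimal inference sketch, not the authors' documented workflow. It assumes the checkpoint is available on the Hub as hiba2/results_t5_wiki and that a conventional T5-style "summarize:" task prefix applies; the evaluation Gen Len of 19.0 suggests a short generation budget.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed repo id; the "summarize: " prefix is a conventional T5
# prompt, not something this card documents.
tokenizer = AutoTokenizer.from_pretrained("hiba2/results_t5_wiki")
model = AutoModelForSeq2SeqLM.from_pretrained("hiba2/results_t5_wiki")

text = "..."  # Arabic source text to summarize
inputs = tokenizer(
    "summarize: " + text,
    return_tensors="pt",
    truncation=True,
    max_length=512,
)
summary_ids = model.generate(**inputs, max_length=20, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```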

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0005
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 250
  • num_epochs: 10
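
The hyperparameters above map directly onto transformers training arguments. The sketch below restates them as a Seq2SeqTrainingArguments object; the output_dir is a hypothetical name, and the surrounding model/dataset/Trainer wiring is omitted because the card does not document it.

```python
from transformers import Seq2SeqTrainingArguments

# Restates the card's listed hyperparameters; output_dir is assumed.
training_args = Seq2SeqTrainingArguments(
    output_dir="results_t5_wiki",
    learning_rate=5e-4,             # learning_rate: 0.0005
    per_device_train_batch_size=2,  # train_batch_size: 2
    per_device_eval_batch_size=2,   # eval_batch_size: 2
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=250,               # lr_scheduler_warmup_steps: 250
    num_train_epochs=10,
    adam_beta1=0.9,                 # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,              # epsilon=1e-08
)
```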

Training results

| Training Loss | Epoch  | Step  | Validation Loss | Rouge1 | Rouge2 | RougeL | RougeLsum | Gen Len |
|:-------------:|:------:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
| 0.8768        | 0.2143 | 500   | 0.0228          | 0.1148 | 0.0128 | 0.1148 | 0.1147    | 19.0    |
| 0.0437        | 0.4286 | 1000  | 0.0111          | 0.1164 | 0.0154 | 0.1168 | 0.1165    | 19.0    |
| 0.0436        | 0.6429 | 1500  | 0.0060          | 0.1168 | 0.0163 | 0.1171 | 0.1169    | 19.0    |
| 0.0212        | 0.8573 | 2000  | 0.0052          | 0.117  | 0.0165 | 0.1173 | 0.117     | 19.0    |
| 0.0161        | 1.0716 | 2500  | 0.0018          | 0.1188 | 0.0194 | 0.1188 | 0.1186    | 19.0    |
| 0.011         | 1.2859 | 3000  | 0.0018          | 0.1188 | 0.0193 | 0.1188 | 0.1186    | 19.0    |
| 0.0094        | 1.5002 | 3500  | 0.0014          | 0.1188 | 0.0194 | 0.1188 | 0.1186    | 19.0    |
| 0.0107        | 1.7145 | 4000  | 0.0007          | 0.1188 | 0.0194 | 0.1188 | 0.1186    | 19.0    |
| 0.0069        | 1.9288 | 4500  | 0.0006          | 0.1188 | 0.0194 | 0.1188 | 0.1186    | 19.0    |
| 0.007         | 2.1432 | 5000  | 0.0006          | 0.1188 | 0.0194 | 0.1188 | 0.1186    | 19.0    |
| 0.0064        | 2.3575 | 5500  | 0.0006          | 0.1188 | 0.0194 | 0.1188 | 0.1186    | 19.0    |
| 0.0062        | 2.5718 | 6000  | 0.0015          | 0.1188 | 0.0194 | 0.1188 | 0.1186    | 19.0    |
| 0.0042        | 2.7861 | 6500  | 0.0005          | 0.1188 | 0.0194 | 0.1188 | 0.1186    | 19.0    |
| 0.0043        | 3.0004 | 7000  | 0.0004          | 0.1188 | 0.0194 | 0.1188 | 0.1186    | 19.0    |
| 0.0042        | 3.2147 | 7500  | 0.0012          | 0.1188 | 0.0194 | 0.1188 | 0.1186    | 19.0    |
| 0.0047        | 3.4291 | 8000  | 0.0010          | 0.1188 | 0.0194 | 0.1188 | 0.1186    | 19.0    |
| 0.0043        | 3.6434 | 8500  | 0.0008          | 0.1188 | 0.0194 | 0.1188 | 0.1186    | 19.0    |
| 0.0024        | 3.8577 | 9000  | 0.0003          | 0.1188 | 0.0194 | 0.1188 | 0.1186    | 19.0    |
| 0.0026        | 4.0720 | 9500  | 0.0005          | 0.1188 | 0.0194 | 0.1188 | 0.1186    | 19.0    |
| 0.0029        | 4.2863 | 10000 | 0.0003          | 0.1188 | 0.0194 | 0.1188 | 0.1186    | 19.0    |
| 0.0045        | 4.5006 | 10500 | 0.0006          | 0.1188 | 0.0194 | 0.1188 | 0.1186    | 19.0    |
| 0.0024        | 4.7150 | 11000 | 0.0001          | 0.1188 | 0.0194 | 0.1188 | 0.1186    | 19.0    |
| 0.0018        | 4.9293 | 11500 | 0.0002          | 0.1188 | 0.0194 | 0.1188 | 0.1186    | 19.0    |
| 0.002         | 5.1436 | 12000 | 0.0002          | 0.1188 | 0.0194 | 0.1188 | 0.1186    | 19.0    |
| 0.0022        | 5.3579 | 12500 | 0.0001          | 0.1188 | 0.0194 | 0.1188 | 0.1186    | 19.0    |
| 0.0017        | 5.5722 | 13000 | 0.0003          | 0.1188 | 0.0194 | 0.1188 | 0.1186    | 19.0    |
| 0.0014        | 5.7865 | 13500 | 0.0005          | 0.1188 | 0.0194 | 0.1188 | 0.1186    | 19.0    |
| 0.0055        | 6.0009 | 14000 | 0.0012          | 0.1188 | 0.0194 | 0.1188 | 0.1186    | 16.3147 |
| 0.0127        | 6.2152 | 14500 | 0.0002          | 0.1188 | 0.0194 | 0.1188 | 0.1186    | 19.0    |
| 0.0012        | 6.4295 | 15000 | 0.0002          | 0.1188 | 0.0194 | 0.1188 | 0.1186    | 19.0    |
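
The Rouge columns above correspond to the rouge metric from the evaluate library (unigram, bigram, longest-common-subsequence, and summary-level LCS overlap). A minimal sketch of that computation, assuming evaluate and its rouge_score dependency are installed:

```python
import evaluate

# Keys in the result dict ('rouge1', 'rouge2', 'rougeL', 'rougeLsum')
# match the columns reported above.
rouge = evaluate.load("rouge")
scores = rouge.compute(
    predictions=["the model's generated summary"],
    references=["the reference summary"],
)
print(scores)
```

Note that the default ROUGE tokenizer is oriented toward Latin-script text, which can depress absolute scores on Arabic output; the low, flat Rouge values in the table should be read with that in mind.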

Framework versions

  • Transformers 4.42.0.dev0
  • Pytorch 2.2.1+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1