---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- bleu
base_model: distilbert/distilgpt2
model-index:
- name: distilgpt2-finetuned
  results: []
---
# distilgpt2-finetuned
This model is a fine-tuned version of [distilbert/distilgpt2](https://huggingface.co/distilbert/distilgpt2) on an unspecified dataset. It achieves the following results on the evaluation set (a sketch for reproducing the metrics follows the list):
- Loss: 2.7322
- Bleu: 0.0145
- Bertscore Precision: 0.1505
- Bertscore Recall: 0.1674
- Bertscore F1: 0.1581
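
The BLEU and BERTScore figures above can be reproduced with the Hugging Face `evaluate` library. A minimal sketch, assuming English-language text and placeholder prediction/reference lists (substitute real generations and references from the evaluation set):

```python
import evaluate

# Hypothetical placeholder texts; substitute real model generations and
# ground-truth references from the (unspecified) evaluation set.
predictions = ["the cat sat on the mat"]
references = ["the cat is sitting on the mat"]

bleu = evaluate.load("bleu")
bertscore = evaluate.load("bertscore")

bleu_result = bleu.compute(predictions=predictions, references=references)
bert_result = bertscore.compute(
    predictions=predictions, references=references, lang="en"
)

# BERTScore returns one precision/recall/f1 value per example; average them.
print(f"BLEU: {bleu_result['bleu']:.4f}")
print(f"BERTScore F1: {sum(bert_result['f1']) / len(bert_result['f1']):.4f}")
```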
## Model description
More information needed
## Intended uses & limitations
More information needed
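
No intended-use details were provided; as a starting point, the checkpoint can be loaded for text generation like any GPT-2-family model. A minimal sketch, assuming a hypothetical repo id `your-username/distilgpt2-finetuned`:

```python
from transformers import pipeline

# Hypothetical repo id; replace with the checkpoint's actual location.
generator = pipeline("text-generation", model="your-username/distilgpt2-finetuned")

output = generator("Once upon a time", max_new_tokens=50, do_sample=True, top_p=0.95)
print(output[0]["generated_text"])
```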
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them follows this list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP
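
These hyperparameters correspond roughly to the following `TrainingArguments` configuration; a sketch, assuming the Transformers `Trainer` API (model and dataset setup omitted, `output_dir` is a placeholder):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above.
# Adam betas (0.9, 0.999) and epsilon 1e-08 are the optimizer defaults.
training_args = TrainingArguments(
    output_dir="distilgpt2-finetuned",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 8 * 2 = 16
    lr_scheduler_type="linear",
    num_train_epochs=20,
    fp16=True,  # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",  # one evaluation per epoch, as in the table below
)
```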
### Training results
| Training Loss | Epoch | Step | Validation Loss | Bleu | Bertscore Precision | Bertscore Recall | Bertscore F1 |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:-------------------:|:----------------:|:------------:|
| 5.0087 | 1.0 | 3223 | 3.9456 | 0.0088 | 0.1478 | 0.1638 | 0.1551 |
| 4.8889 | 2.0 | 6446 | 3.7706 | 0.0093 | 0.1480 | 0.1642 | 0.1554 |
| 4.9152 | 3.0 | 9669 | 3.6252 | 0.0097 | 0.1483 | 0.1646 | 0.1557 |
| 4.647 | 4.0 | 12892 | 3.5105 | 0.0103 | 0.1486 | 0.1649 | 0.1560 |
| 4.4683 | 5.0 | 16115 | 3.4093 | 0.0108 | 0.1489 | 0.1652 | 0.1563 |
| 4.4007 | 6.0 | 19338 | 3.3225 | 0.0110 | 0.1491 | 0.1654 | 0.1565 |
| 4.3966 | 7.0 | 22561 | 3.2444 | 0.0115 | 0.1493 | 0.1656 | 0.1567 |
| 4.3414 | 8.0 | 25784 | 3.1662 | 0.0117 | 0.1494 | 0.1657 | 0.1568 |
| 4.2446 | 9.0 | 29007 | 3.1021 | 0.0122 | 0.1497 | 0.1660 | 0.1571 |
| 4.2464 | 10.0 | 32230 | 3.0384 | 0.0125 | 0.1499 | 0.1662 | 0.1573 |
| 4.1739 | 11.0 | 35453 | 2.9789 | 0.0128 | 0.1499 | 0.1665 | 0.1574 |
| 4.08 | 12.0 | 38676 | 2.9295 | 0.0131 | 0.1501 | 0.1666 | 0.1576 |
| 4.001 | 13.0 | 41899 | 2.8857 | 0.0135 | 0.1502 | 0.1668 | 0.1577 |
| 3.9277 | 14.0 | 45122 | 2.8464 | 0.0136 | 0.1502 | 0.1669 | 0.1578 |
| 3.9709 | 15.0 | 48345 | 2.8137 | 0.0139 | 0.1503 | 0.1670 | 0.1578 |
| 3.9192 | 16.0 | 51568 | 2.7872 | 0.0141 | 0.1503 | 0.1672 | 0.1579 |
| 3.8916 | 17.0 | 54791 | 2.7644 | 0.0143 | 0.1504 | 0.1673 | 0.1580 |
| 3.8489 | 18.0 | 58014 | 2.7475 | 0.0144 | 0.1505 | 0.1674 | 0.1581 |
| 3.9091 | 19.0 | 61237 | 2.7364 | 0.0145 | 0.1505 | 0.1674 | 0.1581 |
| 3.9271 | 20.0 | 64460 | 2.7322 | 0.0145 | 0.1505 | 0.1674 | 0.1581 |
### Framework versions
- Transformers 4.40.2
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1