---
license: apache-2.0
base_model: distilgpt2
tags:
- generated_from_trainer
model-index:
- name: my_awesome_eli5_clm-model
  results: []
---

# my_awesome_eli5_clm-model

This model is a fine-tuned version of [distilgpt2](https://huggingface.co/distilgpt2) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 3.7706

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0

### Training results

| Training Loss | Epoch | Step  | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 4.0085        | 0.13  | 500   | 3.8586          |
| 3.9418        | 0.25  | 1000  | 3.8368          |
| 3.9257        | 0.38  | 1500  | 3.8236          |
| 3.9012        | 0.51  | 2000  | 3.8139          |
| 3.9131        | 0.63  | 2500  | 3.8052          |
| 3.8947        | 0.76  | 3000  | 3.7976          |
| 3.8943        | 0.88  | 3500  | 3.7912          |
| 3.8809        | 1.01  | 4000  | 3.7887          |
| 3.8243        | 1.14  | 4500  | 3.7877          |
| 3.8251        | 1.26  | 5000  | 3.7854          |
| 3.822         | 1.39  | 5500  | 3.7824          |
| 3.8141        | 1.52  | 6000  | 3.7808          |
| 3.8243        | 1.64  | 6500  | 3.7785          |
| 3.8108        | 1.77  | 7000  | 3.7762          |
| 3.8059        | 1.89  | 7500  | 3.7755          |
| 3.7984        | 2.02  | 8000  | 3.7765          |
| 3.7866        | 2.15  | 8500  | 3.7747          |
| 3.7761        | 2.27  | 9000  | 3.7746          |
| 3.7764        | 2.4   | 9500  | 3.7727          |
| 3.779         | 2.53  | 10000 | 3.7727          |
| 3.7744        | 2.65  | 10500 | 3.7719          |
| 3.7685        | 2.78  | 11000 | 3.7708          |
| 3.7694        | 2.9   | 11500 | 3.7706          |

### Framework versions

- Transformers 4.35.2
- PyTorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0
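
## How to use

The card does not include an inference snippet. Below is a minimal text-generation sketch; the repo id `your-username/my_awesome_eli5_clm-model` is a placeholder and should be replaced with the namespace this model is actually published under.

```python
from transformers import pipeline

# Placeholder repo id; substitute the actual namespace of this model.
generator = pipeline(
    "text-generation",
    model="your-username/my_awesome_eli5_clm-model",
)

prompt = "Somatic hypermutation allows the immune system to"
outputs = generator(prompt, max_new_tokens=50, do_sample=True, top_k=50)
print(outputs[0]["generated_text"])
```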
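
Since the validation loss of a causal language model is the mean per-token cross-entropy, its exponential gives the model's perplexity. The final loss of 3.7706 therefore corresponds to a perplexity of roughly 43.4:

```python
import math

eval_loss = 3.7706  # final validation loss from the table above
perplexity = math.exp(eval_loss)
print(f"Perplexity: {perplexity:.2f}")  # ~43.41
```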
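
For reference, the hyperparameters listed above map directly onto `TrainingArguments`. The sketch below shows the corresponding `Trainer` setup under stated assumptions: the training data is undocumented, so a dummy dataset stands in for the real tokenized splits, and the Adam betas/epsilon and linear schedule are left at their Transformers defaults, which match the values reported on this card.

```python
from datasets import Dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("distilgpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 tokenizers ship without a pad token
model = AutoModelForCausalLM.from_pretrained("distilgpt2")

# Dummy stand-in for the undocumented dataset; replace with the real tokenized splits.
def tokenize(batch):
    return tokenizer(batch["text"], truncation=True)

dummy = Dataset.from_dict({"text": ["Explain like I'm five: why is the sky blue?"]})
dummy = dummy.map(tokenize, batched=True, remove_columns=["text"])

# Mirrors the hyperparameters listed above; Adam betas=(0.9, 0.999),
# epsilon=1e-08, and the linear schedule are the defaults.
training_args = TrainingArguments(
    output_dir="my_awesome_eli5_clm-model",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=3.0,
    evaluation_strategy="steps",  # the results table logs evaluation every 500 steps
    eval_steps=500,
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=dummy,  # placeholder: tokenized training split
    eval_dataset=dummy,   # placeholder: tokenized validation split
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```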