# wikipedia_clm_30
This model is a fine-tuned version of an unspecified base model on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 4.4528
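For reference, if this loss is the mean per-token cross-entropy in nats (the `transformers` default for causal language modeling), it corresponds to a perplexity of exp(4.4528) ≈ 85.9.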
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto the `Trainer` API follows the list):
- learning_rate: 0.001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100000
- training_steps: 400000
- mixed_precision_training: Native AMP
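
The card does not include the actual training script; below is a minimal sketch, assuming the standard `transformers` `Trainer` API, of how these values would map onto `TrainingArguments`. The `output_dir` is illustrative.

```python
# Minimal sketch, assuming the standard transformers Trainer API;
# the actual training script is not included in this card.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="wikipedia_clm_30",   # illustrative output path
    learning_rate=1e-3,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=100_000,
    max_steps=400_000,
    fp16=True,  # Native AMP mixed-precision training
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the optimizer
    # defaults, so they need no explicit arguments here.
)
```

A `Trainer` built with this `args` object, the (unspecified) model, a causal-LM data collator, and the tokenized datasets would reproduce the schedule implied above: linear warmup over the first 100,000 steps, then linear decay toward zero at step 400,000.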
### Training results

| Training Loss | Epoch   | Step  | Validation Loss |
|:-------------:|:-------:|:-----:|:---------------:|
| 7.0272        | 1.1170  | 4000  | 6.1887          |
| 5.8772        | 2.2340  | 8000  | 5.5561          |
| 5.3173        | 3.3510  | 12000 | 5.1448          |
| 4.9673        | 4.4680  | 16000 | 4.9279          |
| 4.7421        | 5.5850  | 20000 | 4.7851          |
| 4.5756        | 6.7020  | 24000 | 4.6903          |
| 4.4569        | 7.8190  | 28000 | 4.6298          |
| 4.365         | 8.9361  | 32000 | 4.5899          |
| 4.2915        | 10.0531 | 36000 | 4.5644          |
| 4.2307        | 11.1701 | 40000 | 4.5460          |
| 4.1914        | 12.2871 | 44000 | 4.5348          |
| 4.1552        | 13.4041 | 48000 | 4.5072          |
| 4.1264        | 14.5211 | 52000 | 4.4994          |
| 4.099         | 15.6381 | 56000 | 4.4783          |
| 4.0741        | 16.7551 | 60000 | 4.4639          |
| 4.0551        | 17.8721 | 64000 | 4.4590          |
| 4.035         | 18.9891 | 68000 | 4.4448          |
| 4.0063        | 20.1061 | 72000 | 4.4723          |
| 3.995         | 21.2231 | 76000 | 4.4593          |
| 3.982         | 22.3401 | 80000 | 4.4528          |
### Framework versions
- Transformers 4.45.2
- Pytorch 2.5.1+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1
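
As a usage reference, here is a hedged sketch of loading the checkpoint with these library versions. The path `"wikipedia_clm_30"` is a placeholder (the card gives no hub repo id), and `AutoModelForCausalLM` is assumed from the `clm` in the model name.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder path -- the card does not specify a hub repo id;
# point this at the actual checkpoint directory or repo.
model_path = "wikipedia_clm_30"
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path)  # assumes a causal-LM head

prompt = "The history of"
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```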