This model is a fine-tuned version of [distilgpt2](https://huggingface.co/distilgpt2) on the wikitext dataset (wikitext-2-raw-v1 configuration). It achieves the following results on the evaluation set:
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
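The card itself gives no usage details; the following is a minimal sketch for loading the checkpoint with the transformers library. The repo id `your-username/distilgpt2-finetuned-wikitext2` is a placeholder, not this model's actual id.

```python
# Minimal usage sketch; the repo id below is a placeholder, not this model's real id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/distilgpt2-finetuned-wikitext2"  # hypothetical
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("The history of the English language", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```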
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
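The concrete values are not preserved in this excerpt. As a rough sketch of how such a fine-tune is typically set up with the transformers Trainer on wikitext-2-raw-v1 (every hyperparameter value below is an assumption, not this card's recorded setting):

```python
# Hedged sketch of a typical distilgpt2 fine-tune on wikitext-2-raw-v1.
# All hyperparameter values are assumptions, not this card's recorded settings.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("distilgpt2")
model = AutoModelForCausalLM.from_pretrained("distilgpt2")

raw = load_dataset("wikitext", "wikitext-2-raw-v1")
tokenized = raw.map(
    lambda batch: tokenizer(batch["text"]),
    batched=True,
    remove_columns=["text"],
)

block_size = 128  # assumed context length for training blocks

def group_texts(examples):
    # Concatenate all tokens, then split into fixed-length blocks
    # (the pattern used by the transformers run_clm example script).
    concatenated = {k: sum(examples[k], []) for k in examples}
    total_length = (len(concatenated["input_ids"]) // block_size) * block_size
    result = {
        k: [t[i : i + block_size] for i in range(0, total_length, block_size)]
        for k, t in concatenated.items()
    }
    result["labels"] = result["input_ids"].copy()  # causal LM: labels mirror inputs
    return result

lm_dataset = tokenized.map(group_texts, batched=True)

args = TrainingArguments(
    output_dir="distilgpt2-finetuned-wikitext2",
    learning_rate=2e-5,             # assumed
    per_device_train_batch_size=8,  # assumed
    num_train_epochs=3,             # assumed
    weight_decay=0.01,              # assumed
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=lm_dataset["train"],
    eval_dataset=lm_dataset["validation"],
)
trainer.train()
```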