Indonesian GPT-2-medium finetuned on Indonesian poems
This is the Indonesian GPT-2-medium model fine-tuned on Indonesian poems. The dataset can be found here. All training was done in a Google Colab Jupyter notebook (notebook link coming soon).
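As a quick sanity check, the model can be used for text generation with the transformers library. The sketch below is only illustrative; the model ID is a placeholder and should be replaced with this model's actual Hub repository ID.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "<hub-repo-id>"  # placeholder: replace with this model's Hub ID

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

# Sample a short poem continuation from an Indonesian prompt.
inputs = tokenizer("Wahai rembulan yang tertutup awan hujan", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=64,
    do_sample=True,  # sampling gives more varied, poem-like continuations
    top_k=50,
    top_p=0.95,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```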
The dataset is split into two subsets, with details below:
| split | count (examples) | percentage |
|---|---|---|
| train | 7,358 | 80% |
| validation | 1,890 | 20% |
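For illustration, an 80/20 split like the one above could be produced with the datasets library. This is only a sketch; the dataset ID and seed below are assumptions, not confirmed details of the actual preprocessing.

```python
from datasets import load_dataset

# "id_puisi" is assumed to be the dataset's Hub ID; the seed is arbitrary.
dataset = load_dataset("id_puisi", split="train")
splits = dataset.train_test_split(test_size=0.2, seed=42)
train_ds, val_ds = splits["train"], splits["test"]
print(len(train_ds), len(val_ds))  # roughly 7,358 and 1,890 examples
```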
Evaluation results
The model evaluation results after 10 epochs are as follows:
| dataset | train/loss | eval/loss | eval/perplexity |
|---|---|---|---|
| id puisi | 3.104 | 3.384 | 29.4884 |
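Note that the eval perplexity is simply the exponential of the mean eval cross-entropy loss, which can be verified directly:

```python
import math

eval_loss = 3.384
print(math.exp(eval_loss))  # ≈ 29.49, matching eval/perplexity in the table
```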
The training logs can be found on the wandb page here.