Base model: gpt2-large
Fine-tuned to generate responses on a dataset of vaccine-related public health tweets. For more information about the dataset, task, and training procedure, see our paper. This checkpoint corresponds to the lowest validation perplexity (2.82, reached at 2 epochs) observed during training. See Training metrics for the TensorBoard logs.
For input format and usage examples, see our COVID-19 public health tweet response model.
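A minimal usage sketch with the `transformers` library is shown below. The repository ID and the prompt are placeholders, not values from this card; the exact input format this checkpoint expects is described in the linked COVID-19 public health tweet response model.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repository ID -- replace with this model's actual Hub ID.
MODEL_ID = "your-org/gpt2-large-vaccine-tweet-responses"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

# Example prompt only; follow the input format from the linked model card.
prompt = "Vaccines are now available for all adults. Where can I book an appointment?"
inputs = tokenizer(prompt, return_tensors="pt")

outputs = model.generate(
    **inputs,
    max_new_tokens=60,
    do_sample=True,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```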