voidful/bart-eqg-question-generator
Model description
This model is a sequence-to-sequence question generator that takes only a context as input and generates a question as output. It is based on a pretrained bart-base model and trained on the EQG-RACE corpus.
Intended uses & limitations
The model is trained to generate examination-style multiple-choice questions.
How to use
The model takes a context as its input sequence and generates a question as its output sequence. The maximum sequence length is 1024 tokens. Inputs should be organised into the following format:
context
The input sequence can then be encoded and passed as the input_ids argument to the model's generate() method.
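Below is a minimal usage sketch with the Hugging Face transformers library. The example context, beam size, and output length are illustrative choices, not values prescribed by this model card.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the tokenizer and seq2seq model from the Hub.
tokenizer = AutoTokenizer.from_pretrained("voidful/bart-eqg-question-generator")
model = AutoModelForSeq2SeqLM.from_pretrained("voidful/bart-eqg-question-generator")

# Hypothetical example context; any passage of text can be used.
context = (
    "The Great Wall of China was built over many centuries to protect "
    "the northern borders of Chinese states and empires."
)

# Encode the context, truncating to the 1024-token limit.
input_ids = tokenizer(
    context, return_tensors="pt", truncation=True, max_length=1024
).input_ids

# Generate a question; generation settings here are illustrative.
output_ids = model.generate(input_ids, max_length=64, num_beams=4, early_stopping=True)

question = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(question)
```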