voidful/bart-eqg-question-generator
Model description
This model is a sequence-to-sequence question generator that takes only a context as input and generates a question as output. It is based on a pretrained bart-base model and fine-tuned on the EQG-RACE corpus.
Intended uses & limitations
The model is trained to generate examination-style multiple-choice questions.
How to use
The model takes a context as the input sequence and generates a question as the output sequence. The maximum sequence length is 1024 tokens. Inputs should be organised in the following format:
context
The input sequence can then be encoded and passed as the input_ids argument to the model's generate() method.
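
As a minimal sketch, the model can be loaded and queried with the standard transformers API; the example context and the generation parameters (beam size, output length) below are illustrative choices, not values prescribed by this model card.

```python
from transformers import AutoTokenizer, BartForConditionalGeneration

model_name = "voidful/bart-eqg-question-generator"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name)

# Example context (hypothetical input used only for illustration).
context = (
    "The Eiffel Tower was completed in 1889 as the entrance arch to the "
    "World's Fair held in Paris that year."
)

# Encode the context, truncating to the 1024-token limit, and generate a question.
input_ids = tokenizer(
    context, return_tensors="pt", truncation=True, max_length=1024
).input_ids
output_ids = model.generate(
    input_ids, max_length=64, num_beams=4, early_stopping=True
)
question = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(question)
```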