---
license: afl-3.0
---

# Question generation using T5 transformer

Import the pretrained model and its tokenizer:
```
from transformers import T5ForConditionalGeneration, T5Tokenizer

model = T5ForConditionalGeneration.from_pretrained('AbhilashDatta/T5_Qgen-squad-marco')
tokenizer = T5Tokenizer.from_pretrained('AbhilashDatta/T5_Qgen-squad-marco')
```

Then use the tokenizer to encode the input and the model to generate the question:
```
prompt = "answer: Abhilash context: My name is Abhilash Datta."
batch = tokenizer(prompt, padding='longest', max_length=512, return_tensors='pt')

# batch['input_ids'] is already a (1, seq_len) tensor, so it can be passed
# to generate() directly
ques_ids = model.generate(batch['input_ids'], max_length=100, early_stopping=True)
ques_batch = [tokenizer.decode(g, skip_special_tokens=True, clean_up_tokenization_spaces=False) for g in ques_ids]

print(ques_batch)
```

Output:
```
['what is my name']
```
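
The model expects its input in the `answer: <answer> context: <context>` format shown above. A minimal sketch of a helper that builds such prompts (the `make_qgen_input` name is a hypothetical convenience, not part of this repository):

```python
def make_qgen_input(answer: str, context: str) -> str:
    """Format an (answer, context) pair into the prompt the model expects."""
    # The fine-tuned checkpoint was trained on prompts of this shape,
    # so generation quality depends on matching it exactly.
    return f"answer: {answer} context: {context}"


if __name__ == "__main__":
    print(make_qgen_input("Abhilash", "My name is Abhilash Datta."))
    # → answer: Abhilash context: My name is Abhilash Datta.
```

Each formatted string can then be tokenized and passed to `model.generate` exactly as in the example above.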