# GPT-3 small
A pretrained GPT-Neo model (GPT-3 small) whose architecture intentionally resembles that of GPT-3. The model was trained on a Vietnamese dataset for text generation.
## How to use the model
```python
from transformers import GPT2Tokenizer, GPTNeoForCausalLM

# Load the tokenizer and model from the Hugging Face Hub
tokenizer = GPT2Tokenizer.from_pretrained('minhtoan/gpt3-small-vietnamese')
model = GPTNeoForCausalLM.from_pretrained('minhtoan/gpt3-small-vietnamese')

# Vietnamese prompt: "Fruits and vegetables are usually cheaper in season"
text = "Hoa quả và rau thường rẻ hơn khi vào mùa"
input_ids = tokenizer.encode(text, return_tensors='pt')

# Sample a continuation of up to 100 tokens
max_length = 100
sample_outputs = model.generate(input_ids, do_sample=True, max_length=max_length)

for i, sample_output in enumerate(sample_outputs):
    print(">> Generated text {}\n\n{}".format(i + 1, tokenizer.decode(sample_output.tolist())))
    print('\n---')
```
## Author
Phan Minh Toan