---
license: mit
language:
- ru
- en
pipeline_tag: text-generation
inference: false
tags:
- gpt3
- transformers
- pytorch
---
This is a generative model based on [ai-forever/ruGPT-3.5-13B](https://huggingface.co/ai-forever/ruGPT-3.5-13B), quantized to 8-bit in the GPTQ format.

## Examples of usage

```python
from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM

model = AutoGPTQForCausalLM.from_quantized('Gaivoronsky/ruGPT-3.5-13B-8bit', device="cuda:0", use_triton=False)
tokenizer = AutoTokenizer.from_pretrained('Gaivoronsky/ruGPT-3.5-13B-8bit')

request = "Человек: Сколько весит жираф? Помощник: "
encoded_input = tokenizer(request, return_tensors='pt',
                          add_special_tokens=False).to('cuda')
output = model.generate(
    **encoded_input,
    num_beams=2,
    do_sample=True,
    max_new_tokens=100
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
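The prompt in the example above interleaves «Человек:» ("Human:") and «Помощник:» ("Assistant:") turns in plain text. A small helper for building such prompts from a conversation history can keep multi-turn usage consistent (a hypothetical sketch; only the role labels are taken from the example above):

```python
def build_prompt(turns):
    """Format a list of (role, text) pairs into the dialogue style
    used in the example above, ending with an open assistant turn
    for the model to complete."""
    labels = {"user": "Человек", "assistant": "Помощник"}
    parts = [f"{labels[role]}: {text}" for role, text in turns]
    parts.append("Помощник: ")  # open turn the model will continue
    return " ".join(parts)

request = build_prompt([("user", "Сколько весит жираф?")])
# request == "Человек: Сколько весит жираф? Помощник: "
```

The resulting string can be passed to the tokenizer exactly as `request` is in the example above.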