---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/modelcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/model-cards
{}
---

## Usage

The snippet below loads the model in bfloat16, formats a chat prompt with the tokenizer's chat template, and greedily generates a reply:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

model_id = 'datapaf/fvt_ift_rus'

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map='auto'
)

# System prompt: "You are an AI assistant, answer the question";
# user message: "Hi! How are you?"
chat = [
    {"role": "system", "content": "Ты AI-помощник, ответь на вопрос"},
    {"role": "user", "content": "Привет! Как дела?"},
]

# Render the conversation with the model's chat template, then tokenize
templated = tokenizer.apply_chat_template(chat, tokenize=False)
encoded = tokenizer(templated, return_tensors="pt", add_special_tokens=True)
inputs = {key: tensor.to(model.device) for key, tensor in encoded.items()}

output = model.generate(
    **inputs,
    max_new_tokens=1024,
    do_sample=False,           # greedy decoding
    repetition_penalty=1.2
)

# Decode only the generated reply: skip the prompt tokens
# (plus two template tokens) from the output sequence
decoded_output = tokenizer.decode(
    output[0][inputs['input_ids'].size(1) + 2:],
    skip_special_tokens=True
)

print(decoded_output)
```