---
license: apache-2.0
base_model: mistralai/Mistral-7B-Instruct-v0.2
tags:
- lora
- chat
- turkish
- instruct
---

# ChatGPT-Style LoRA Model

This repo contains only the **LoRA adapter weights**.
The base model must be downloaded separately.
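
A LoRA adapter is small because it stores only low-rank update matrices: at inference the effective weight is `W + (alpha / r) * B @ A`, where `r` is the adapter rank and `alpha` a scaling constant. A toy sketch with plain Python lists (shapes and values are purely illustrative):

```python
def matmul(X, Y):
    # Naive matrix multiply, enough for these tiny illustrative matrices.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*Y)] for row in X]

def apply_lora(W, A, B, alpha, r):
    # Effective weight: W + (alpha / r) * B @ A, as in the LoRA formulation.
    scale = alpha / r
    delta = matmul(B, A)
    return [[w + scale * d for w, d in zip(wr, dr)] for wr, dr in zip(W, delta)]

# 2x2 base weight, rank-1 update (r=1): B is 2x1, A is 1x2.
W = [[1.0, 0.0], [0.0, 1.0]]
B = [[1.0], [2.0]]
A = [[0.5, 0.5]]
print(apply_lora(W, A, B, alpha=2, r=1))  # → [[2.0, 1.0], [2.0, 3.0]]
```

This is why the adapter files above are tiny compared to the 7B base checkpoint: only `A` and `B` are stored per adapted layer.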

## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model = "mistralai/Mistral-7B-Instruct-v0.2"
lora_model = "USERNAME/MODEL_NAME"  # replace with this repo's Hub id

tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(base_model)
model = PeftModel.from_pretrained(model, lora_model)

prompt = "Merhaba, nasılsın?"  # Turkish: "Hello, how are you?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
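
Note that Mistral-7B-Instruct was trained on prompts wrapped in its `[INST] ... [/INST]` instruction template; `tokenizer.apply_chat_template` builds this for you and is the robust option. As a minimal sketch of the single-turn format (`format_mistral_prompt` is an illustrative helper, not part of either library):

```python
def format_mistral_prompt(user_message: str) -> str:
    # Single-turn Mistral instruct format. The BOS token (<s>) is normally
    # prepended by the tokenizer itself, so it is omitted here.
    return f"[INST] {user_message} [/INST]"

prompt = format_mistral_prompt("Merhaba, nasılsın?")
# then, as above: inputs = tokenizer(prompt, return_tensors="pt")
```

Sending raw text without this wrapping usually still generates output, but instruction-following quality tends to degrade.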