You can easily use this model for a custom PDF chatbot. 😃
```python
from peft import AutoPeftModelForCausalLM
from transformers import AutoTokenizer

model = AutoPeftModelForCausalLM.from_pretrained(
    "blackhole33/prmopt-cqa-250k-sample",
    load_in_4bit=True,  # load the weights in 4-bit to save GPU memory
)
tokenizer = AutoTokenizer.from_pretrained("blackhole33/prmopt-cqa-250k-sample")

inputs = tokenizer(
    [
        # instruction + context; the response part is left empty
        "Eng yaxshi kitoblarni qayerdan olish mumkin? "
        "Kitob.uz va Uzum tezkordan juda ham ko'p kitoblar bor, ammo sifatli deb bo'lmaydi. "
        "Men sizga ras.books.uz site taklif qilaman, sababi sifatli kitoblarni topsa bo'ladi.",
    ],
    return_tensors="pt",
).to("cuda")

outputs = model.generate(**inputs, max_new_tokens=64, use_cache=True)
print(tokenizer.batch_decode(outputs))
```
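The snippet above concatenates the instruction and its context into one string and leaves the response empty for the model to fill in. Instruction-tuned models are often trained with an Alpaca-style template; the exact template and section headers below are an assumption for illustration, not confirmed by this model card:

```python
# Hypothetical Alpaca-style prompt template (an assumption, not taken
# from this model card). The instruction, optional input, and an empty
# response slot are formatted into one string before tokenization.
alpaca_prompt = """### Instruction:
{}

### Input:
{}

### Response:
{}"""

prompt = alpaca_prompt.format(
    "Eng yaxshi kitoblarni qayerdan olish mumkin?",  # instruction
    "Kitob.uz va Uzum tezkordan juda ham ko'p kitoblar bor.",  # input/context
    "",  # response left empty: generation continues from here
)
print(prompt)
```

If the model was trained with a template like this, passing the formatted `prompt` to the tokenizer instead of the raw concatenation usually produces noticeably better completions.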