# MIXdevAI-yandexGPT5-8B

A model built by merging YandexGPT-5-Lite-8B-instruct (70%) and Saiga YandexGPT 8B (30%).
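The card does not state which merge algorithm was used; a common choice for this kind of 70/30 blend is a plain linear (weighted-average) merge, where each parameter tensor of the result is `0.7 * A + 0.3 * B`. The sketch below illustrates the idea on toy state dicts of Python lists; the function name and shapes are illustrative, not the actual merge code.

```python
def linear_merge(state_a, state_b, alpha=0.7):
    """Parameter-wise linear merge: alpha * A + (1 - alpha) * B.

    Both inputs are dicts mapping parameter names to flat lists of weights
    with identical keys and shapes (a toy stand-in for model state dicts).
    """
    assert state_a.keys() == state_b.keys(), "models must share parameter names"
    return {
        name: [alpha * a + (1 - alpha) * b
               for a, b in zip(state_a[name], state_b[name])]
        for name in state_a
    }

# Toy "models" with one two-element parameter each
a = {"layer.weight": [1.0, 2.0]}
b = {"layer.weight": [3.0, 4.0]}
merged = linear_merge(a, b, alpha=0.7)
print(merged["layer.weight"])  # [1.6, 2.6]
```

With real checkpoints the same arithmetic would be applied tensor-by-tensor (e.g. over `model.state_dict()` entries), which is what tools like mergekit automate.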

## Quick start

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the merged model and its tokenizer from the Hub
model = AutoModelForCausalLM.from_pretrained(
    "Kolyadual/MIXdevAI-yandexGPT5-8B", device_map="auto", torch_dtype="auto"
)
tokenizer = AutoTokenizer.from_pretrained("Kolyadual/MIXdevAI-yandexGPT5-8B")

messages = [{"role": "user", "content": "Привет! Как дела?"}]

# Render the conversation with the model's chat template and append
# the assistant turn so generation continues as the assistant
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
Format: Safetensors · Model size: 8B params · Tensor type: BF16
